
 

FOLIO

Facilitated Online Learning as an Interactive Opportunity

 

 


 

 

Planning and Conducting Information Needs Analysis (PACINA)

 

Briefing 4

February 2005

 

Choosing methodology: What to ask and how to answer it

 

Introduction

 

As with any project, the key to running a successful Information Needs Analysis (INA) is good planning. For some self-contained projects a detailed knowledge of the research base may be a luxury, but a literature review to see what is already known about the needs of your own user group (or, failing an exact match, of similar users) is usually worthwhile, especially articles discussing the relative merits of differing methodologies. In addition to providing background information and a summary of existing knowledge, reviewing the literature may unearth tools (standardised questionnaires etc.) which your new study can use or adapt for its own data collection. In this briefing we explore two key issues:

 

  • What question is your INA setting out to answer?
  • What method will you use for data collection?

 

In the next briefing we will look at more “managerial” issues around getting the audit started.

 

What are you going to do?

 

The outcome of these preparatory stages should be a water-tight and answerable question: what do you want to find out? In refining your question, you need to address four issues:

 

Background to the study: the question must be framed to reflect the service's functions (archival, cultural, educational, recreational, research support), its mission, and user perspectives. Traditionally, user studies can take any of six perspectives:

 

  1. Demands users place on the library and information services.

  2. Awareness of the services among the “user community” of both current and potential users.

  3. User satisfaction with those services.

  4. The importance users attach to various services the library and information service already provides, or might consider providing at some stage in the future.

  5. Demographic changes in the user community which might affect future demands.

  6. Personal interests and activities of members of the user community.

 

Questions you specifically require an answer to: these include basic demographic information about the users (age, professional group); their preferences and attitudes (given a choice, which services and materials will users opt for?); and their behaviour (is the prime purpose of the users’ visits work, personal development, study, or research?).

 

Other questions that will probably arise: these may cover local issues, such as other providers, or trends in information dissemination (e.g. university-wide distribution systems).

 

Other questions that may possibly arise: these cover areas like user satisfaction, gaps in the services currently offered, the organisation’s strengths and weaknesses as perceived by its users, and options for future development: would users be willing to pay for “value-added” services, or, if there is a choice, which services could be run down to free up funds for new ones?

 

How are you going to do it?

 

Data comes in various forms, and you need to be aware of the following dichotomies:

 

Retrospective versus prospective: retrospective analyses take data that has already been collected and is readily to hand. In a library situation, such data might include records of borrowing, inter-library loans and training sessions delivered in the library. It has the advantage of being already available (so you can often skip data collection and proceed directly to analysis) and cheap, or even free! However, since it was not collected with your INA in mind, it may not be ideal: for instance, borrowing figures may be available only as monthly totals, not broken down into subject categories. With modern data protection legislation, there may also be problems if the data is “person identifiable”: if it can be traced back to an individual, you may only be able to use it for the purposes for which it was originally collected (library administration) and not for other purposes (research or audit).
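
By way of illustration (the file name, column names and categories in this sketch are invented, not taken from any real library system), a few lines of Python show how such ready-made records might be aggregated, with person-identifiable fields removed first:

    import pandas as pd

    # Hypothetical export of loan records; the file and column names are
    # assumptions for illustration, not the output of any real system.
    loans = pd.read_csv("loan_records.csv", parse_dates=["date_issued"])

    # Drop person-identifiable fields before audit or research use, so the
    # remaining data cannot be traced back to an individual borrower.
    loans = loans.drop(columns=["borrower_id", "borrower_name"])

    # Aggregate to monthly totals per subject category. If the system only
    # exports monthly grand totals, this breakdown is simply unavailable.
    monthly = (loans
               .groupby([loans["date_issued"].dt.to_period("M"), "subject"])
               .size()
               .rename("loans"))
    print(monthly)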

 

Prospective data is collected afresh for the study: you design a specific tool (see below) to capture exactly the right data for your question, and start collecting when the project goes live. This brings added expense (you may have to design and print special forms, or devote time to interviews), and analysis cannot proceed very far until you have accumulated most of the responses. When recruiting respondents, you can obtain consent to use their answers for research, so there are fewer data protection issues. Prospective data is thus of higher quality, but slower and more expensive to obtain.

 

Qualitative versus quantitative data: you usually obtain qualitative data (Latin qualis = what sort?) through techniques like questionnaires in which subjects can write answers in their own words, or interviews, where you record responses from the subjects verbatim and later transcribe them. The responses are thus the words and opinions of your subjects.

Quantitative data (Latin quantus = how much?) is numeric, produced from direct measurements or from questionnaires which require subjects to select discrete categories, either purely numerical (How many books a week do you borrow: 1; 2; 3; 4; 5 or more?) or carrying some degree of ranking (Please state your highest educational qualification: GCSE; NVQ; bachelor’s degree; master’s degree; PhD).
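
As a minimal sketch (the numeric codes below are an invented coding scheme for the qualification question above), this is how ranked categories are typically coded for numerical analysis:

    from statistics import median

    # Hypothetical ordinal codes for the qualification question above;
    # the codes preserve the ranking between categories.
    QUALIFICATION_CODES = {
        "GCSE": 1,
        "NVQ": 2,
        "bachelor's degree": 3,
        "master's degree": 4,
        "PhD": 5,
    }

    responses = ["GCSE", "master's degree", "bachelor's degree", "GCSE"]
    coded = [QUALIFICATION_CODES[r] for r in responses]

    # The median suits ordinal data: a mean would assume the gaps between
    # categories are equal, which ordinal codes do not imply.
    print(coded, median(coded))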

 

Traditionally, quantitative data is what “real scientists” produce: it is robust, and you can use mathematical and statistical tests to “prove” hypotheses and formulate physical laws, whereas qualitative data is seen as less robust: subjective, derived from opinion, and open only to less rigorous statistical treatment. However, the numerical coding enforced by quantitative data loses the richness of experience that comes from allowing respondents to use their own words and phrases. There is an emerging consensus that neither method is universally better than the other: you should select the methodology appropriate to the question you are posing. Qualitative methods, for example, are often best for questions about people’s beliefs and motivation. In practice, an INA will use a combination of both techniques: quantitative questions about frequency of use, and qualitative questions about the user’s experience, for each service.

 

In collecting the data for an INA, several techniques are available:

 

Direct observation: staff log user behaviour.

Exit interviews: approaching users as they leave the library, have books stamped out, or collect reservations. The advantage is that you are asking about recent events, but to avoid disrupting the organisation’s smooth functioning, any such interviews must be very brief.

Booked interviews: you make an appointment with the subject for an interview, although there may be trouble finding a mutually convenient time.

Questionnaires: see below.

In-house records: statistics on borrowing, inter-library loans, requests for training, access to CD-ROMs held behind the counter.

Diaries or journals: detailed records of what a user retrieves and what they do with it. These can provide very in-depth information, but are time-consuming to collect and to analyse.

 

Choosing a methodology

 

Essentially, there are two types of instrument for data collection:

 

Direct personal contact: typically a one-to-one interview between the librarian and the user.

Indirect or impersonal methods: such as a postal questionnaire.

 

Personal approaches allow more in-depth exploration. Since the researcher either tapes the interview or takes the notes, the interviewee may provide more information than they would bother to record in a questionnaire. The direct approach allows clarification of issues (such as explanation of acronyms or technical terms), or exploration of new topics which the subject volunteers. You may be able to draw out deeper reflection from your subjects, and they may be willing to offer personal insights which they would not disclose to a questionnaire, although they may be more reluctant to recount personally embarrassing episodes face to face. The time needed to run and transcribe such interviews can be considerable (e.g. eight hours for every hour of recording), and the success of the interview may depend on finding suitable times and locations, and on establishing good rapport between interviewer and interviewee. Within an interview, you can use special techniques, such as critical incident analysis: ask about an episode when the user experienced either a very successful information outcome or a very unsatisfactory one, and analyse the factors which contributed to success or failure in that episode.

 

In designing interviews, you may need to think about the format:

 

  • Structured, with very specific questions. Structured interviews are appropriate where the interviewer already has a good idea of what the issues are, and needs to get specific answers:

We know that the inter-library loan system does not work very well.

How often do you use inter-library loans?

Do you get what you need from them?

Do they arrive on time?

We have come up with some options: how effective do you think these might be?

 

  • Semi-structured, with leading questions acting as springboards from which the user can offer their own ideas.
  • Unstructured, or open-ended, where the user is given free rein, and the interviewer keeps the process going by asking questions to clarify, or where the narrative falters. Open-ended techniques may be more appropriate where the interviewer is still on an exploratory fact-finding mission:

What is your experience of the inter-library loan system, and can you think of any ways we could improve it?

 

Interviews are usually “one-to-one”, but another option is to convene a focus group of users, typically about six to twelve in all, each session running for about forty-five to ninety minutes. Such groups can often uncover a richness of user perspectives as the group raises and discusses issues. Members who might not have thought about an issue in a one-to-one situation may reflect and offer their view when someone else raises the point. You can also study interactions, and the relations between different users (students, post-graduate research staff, clinical staff, managers). Focus groups can, however, be complex to organise (you need a common time slot that suits a dozen people), generate large amounts of material for transcription, and require careful facilitation if they are not to degenerate into arguments where different users have conflicting interests.

 

A more formal technique than focus groups is the Delphi technique. Again, you need a group of informants, but in this case they must be knowledgeable experts (here, about the available services and other alternatives). As this is not often the case in service user populations, the technique is not widely used in information services. Again, the group has a facilitator, but this time the process requires more formal documentation and multiple rounds. At the end of each session, one member writes up the group’s conclusions and circulates them. Other members can then agree with, disagree with, modify, or add to these statements. The revised draft goes to the next meeting (or is circulated electronically for e-mail discussion), and is refined further. Over several cycles, this produces a consensus from the group.

 

Where there are geographical barriers, telephone interviews can be useful, but they add the expense of the call and of special recording equipment, and can be quite complex to set up. You may also find that the interview takes longer than one held in a dedicated interview room, as the subject may face many local distractions.

 

Action research (or obtrusive observation, as some Americans call it) involves the researcher stepping into the subject’s shoes, observing them searching, using catalogues, and so on. Often there is a balance between observing and participating (the interviewer may join in the subject’s searching), and there may be a “de-briefing” in the form of a short interview. Such techniques provide a “reality check” (you can see exactly what the subjects are doing), but the presence of the researcher may itself alter the subject’s behaviour. Traditionally, researchers could engage in covert observation to reduce their impact on user behaviour, but in the current climate of informed consent and research ethics, observing library users without their express permission seems problematic.

 

Questionnaires may paradoxically be better at exploring very sensitive issues (e.g. users whom the subject finds difficult to deal with, or gaps in their own knowledge or training), provided anonymity can be guaranteed. You can collect information from more subjects, and if the form is pre-coded, data entry is much faster. Because you force respondents to select specific options, numerical analysis is easier, but at the expense of the richness of interviews in the subjects’ own words. Designing questionnaires which are neither ambiguous nor misleading is quite a skill, and you should always pilot a questionnaire before starting formal data collection.
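
To illustrate why pre-coding speeds up data entry and analysis, here is a small sketch; the responses are invented, and the codes follow the borrowing question from the earlier example:

    from collections import Counter

    # Pre-coded options for "How many books a week do you borrow?"
    # (code 5 stands for "5 or more", as in the example earlier).
    OPTION_LABELS = {1: "1", 2: "2", 3: "3", 4: "4", 5: "5 or more"}

    # Invented responses: data entry is just a matter of keying in the
    # circled code from each form, with no transcription step.
    answers = [1, 3, 2, 5, 3, 3, 1, 4, 5, 2]

    counts = Counter(answers)
    total = len(answers)
    for code in sorted(OPTION_LABELS):
        n = counts[code]
        print(f"{OPTION_LABELS[code]:>9}: {n:2d} ({100 * n / total:.0f}%)")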

 

 

 

Recommended reading:

 

Booth A (2003) A quest for questionnaires. Health Information and Libraries Journal 20(1): 53-56.

 

Bookstein A (1985) Questionnaire research in a library setting. Journal of Academic Librarianship 11(1): 24-28.

 

Fowler F (1995) Improving Survey Questions: design and evaluation. Thousand Oaks, California: Sage.

 

Schuman H, Presser S (1996) Questions and Answers in Attitude Surveys: experiments on question form, wording and context. Thousand Oaks, California: Sage.

 

Jorgensen D (1989) Participant Observation: a methodology for human studies. Thousand Oaks, California: Sage.

 

Kalton G (1983) Introduction to Survey Sampling. Newbury Park, California: Sage.

 

Powell R (1997) Basic Research Methods for Librarians. Greenwich, Connecticut: Ablex.

 

 

 

 

     
