

 

FOLIO

Facilitated Online Learning as an Interactive Opportunity

 

 


 

 

Planning and Conducting Information Needs Analysis (PACINA)

 

Briefing 5

 February 2005

 

Conducting an information needs analysis

 

Introduction

Assuming that you have refined your question and chosen your methodology, you are now probably itching to start collecting some “data.” However, conducting the audit involves more than waving questionnaires at everyone who crosses your threshold: there is still some spadework to do. Before you rush off to gather data for your INA, you need to think about the following components:

 

Stakeholders: who are the key players you need to get on board? Among those you need to engage are:

  • Library and information service staff, who may have to implement recommendations, or may be key in collecting data.

  • The organisational director, who may have to sign off projects.

  • Senior managers, who may have to authorise use of resources.

  • Service users, who may be affected by any changes. Remember “invisible” users, who may never encounter desk staff, but use texts for reference only or use e-facilities (such as on-line catalogues and databases) remotely.

 

At this stage, you may need to consolidate your ideas about the INA by writing a formal proposal, both to engage these stakeholders and to give them something to discuss when you approach them. This should cover the nature of the study, anticipated benefits, goals and any input you need from your stakeholders. In engaging stakeholders, brainstorming sessions may be useful to generate ideas and refine the initial draft proposal. Later on, brainstorming can be useful in generating ideas for questionnaires, although after the initial creative rush, you may need to put in a lot of work converting raw ideas into focused questions that accurately capture the data you need.

 

Ethical or legal restrictions: you need to think how these may apply to your study, for example if you plan to include at-risk or vulnerable groups in the target audience. If your aims include a formal research project, with possible publication, or if your subjects include NHS staff, you may need to approach the local ethics committee, or even make a formal application for permission to proceed. At this stage, be alert to potential objections to your survey. You must address such concerns rather than glossing over them.

 

Leadership: to succeed in its aims, the needs analysis must have dedicated leadership, either an individual or a coherent team. In smaller units, a “single-handed” approach may be the only option. In larger units a team may be better, but leadership is still required: someone must be able to make quick decisions (possibly without reaching a consensus, where fast turn-around is needed) and to negotiate for resources.

 

Lines of accountability: there must be clear lines of reporting and accountability, and clear arrangements for data ownership and dissemination.

 

 

Determine scope and resource allocation

Scope can be defined by the type of information (comprehensive, archives, technological, etc.) or by the coverage of the organisation (organisation-wide, or specific departments). If you have not already considered them, you should now address the issue of resources and their allocation to achieve your goals. Resources will include the following factors:

  • Human: who has responsibility for collecting the data (interviewing; distribution and return of questionnaires); receipt, cataloguing and inputting of data; data analysis; report writing; dissemination of conclusions; and implementation of recommendations or an action plan?

  • Physical: provision of interview facilities; production of questionnaires.

  • Financial: costs of printing questionnaires; equipment for recording interviews; cost of transcribing interviews; postage and (if you use telephone interviewing) telephone calls.

  • Technical: do you have skills such as questionnaire design, statistical analysis and report writing “in-house”, or will you need to buy in some expertise?

 

In reviewing your resources, and how they may define or restrict the scope of the Information Needs Analysis (INA), the three main areas to assess are:

 

  • Staff skills, knowledge and experience: this may come from your existing knowledge of staff qualifications and interests, but a flyer or e-mail asking for expressions of interest may be useful to flush out hidden talent. A little lateral thinking to bring in administrative staff, voluntary staff, etc. may be useful here. Specifically, you may need to find out whether you have “in-house” skills like questionnaire design and facilitating focus groups. If you identify specific gaps, you may need to consider training staff or buying in some professional services from outside.

  • Time available to complete the project: this may necessitate revising work schedules, or deferring routine tasks. You must also devise a realistic timetable for the project. Techniques like Gantt charts, with time-lines for each process, identifying crucial landmarks and showing where processes overlap or where one process depends on another, can be very useful here.

  • Funding and materials: this may include printing questionnaires, incentives to participate (small prizes, or a draw for book tokens), telephone bills and recording equipment. At this stage, you may need to convert an allocated budget into specific items of expenditure.
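The timetable point above mentions Gantt charts for showing overlaps and dependencies. Even a crude text-based chart can make these visible; the sketch below is a minimal illustration in Python, and all task names, start weeks and durations are invented for the example:

```python
# Hypothetical INA project plan: (task, start week, duration in weeks).
# All task names and timings are invented for illustration.
TASKS = [
    ("Draft proposal", 0, 2),
    ("Pilot questionnaire", 2, 2),
    ("Data collection", 3, 6),      # overlaps the pilot, as the chart shows
    ("Analysis", 8, 3),
    ("Report and dissemination", 10, 2),
]

def gantt(tasks, total_weeks=12):
    """Return a crude text Gantt chart, one row per task."""
    rows = []
    for name, start, duration in tasks:
        bar = " " * start + "#" * duration
        rows.append(f"{name:<26}|{bar:<{total_weeks}}|")
    return "\n".join(rows)

print(gantt(TASKS))
```

Reading down the rows shows at a glance where processes run in parallel and where one must finish before another starts.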

 

Designing your data collection tool

 

Once you have decided on the most appropriate methodology, you must produce a tool to capture data. For open-ended interviews, this may be nothing more than a simple form recording interviewer, interviewee, date and topic, although it is good practice to provide an information sheet which the interviewee can read and take away with them, and a consent form allowing them to “opt in” to your study. Semi-structured and structured interviews will require a schedule, possibly with prompt questions (e.g. if the interviewee says they regularly use CD-ROMs, ask which databases and resources they use on this medium), so that all staff conducting the interviews gather consistent information.
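A semi-structured schedule can be as simple as a list of main questions, each with optional prompts. The sketch below (in Python, with entirely illustrative question wording) shows one way to hold such a schedule so that every interviewer works from the same script:

```python
# A minimal interview schedule: each main question carries optional
# prompts, so all interviewers gather consistent information.
# Question wording here is invented for illustration.
SCHEDULE = [
    {
        "question": "Which information resources do you use regularly?",
        "prompts": [
            "If CD-ROMs: which databases and resources on that medium?",
            "If on-line catalogues: accessed on-site or remotely?",
        ],
    },
    {
        "question": "What stops you finding the information you need?",
        "prompts": [],
    },
]

def print_schedule(schedule):
    """Print the schedule as an interviewer would read it."""
    for number, item in enumerate(schedule, 1):
        print(f"Q{number}. {item['question']}")
        for prompt in item["prompts"]:
            print(f"    Prompt: {prompt}")

print_schedule(SCHEDULE)
```

Keeping the prompts alongside each question, rather than in interviewers’ heads, is what makes the resulting data comparable across interviews.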

 

Questionnaires are the most complex tools to devise, as they must “stand alone”: the subject completing them may not be able to ask for clarification of any sections they do not understand. Such issues are highlighted in our short, light-hearted competition on “how not to design a questionnaire.”

 

 

 

Testing, testing…

 

The most crucial stage for any questionnaire is piloting it. While developing the questionnaire, you may get friends and colleagues to review drafts and point out the need for better wording or clearer instructions. You will also need a "dry run", ideally with people similar to those who will fill in the final version. You must also obtain feedback on where your questionnaire needs revision. If you are lucky (or clever!) and the questionnaire needs only superficial changes (or none at all!), you may still be able to use data from the pilot sample for the main study.

 

Recruitment

Now, you need to find subjects who are willing to complete your questionnaires, undergo interview or join a focus group.

 

One way to recruit interviewees is an “exit interview” at the end of an encounter such as a training session or a guided search. Time may be tight, but you may be able to book a mutually agreeable slot for a later session. Brief interviews (a few minutes to twenty minutes) may be useful where there are a few closed questions, but the subject matter is too complex for a written questionnaire, and the interview allows the explanation of such issues. In-depth interviews, allowing the full exploration of users’ feelings and experiences through open-ended questions, can take an hour, and may require careful timetabling. These are costly techniques, and rather than a representative sample, you may need to concentrate on a few key members of the user group. Alternatively, you may target those you have identified as having influence or reflections you are unlikely to get from other users. You do, however, need to be aware that such respondents will not be typical of library users: what they tell you may offer great insight, but may not be very generalisable to the wider user population.

 

Questionnaires can be distributed in various ways: a “help-yourself” display in the public area; handing them out to users of desk services; or (assuming you have an address list of all service users) a mail shot. Remember to include clear instructions for returning questionnaires and a reply-paid envelope for postal returns. Although you can leave a stack of questionnaires by the door under a “Please complete one” notice, you will get better completion rates if you take a more active role and approach people directly, so that you can explain the purpose of the survey. Suitable “captive audiences” include those attending for training or using the enquiry desk or other services.

Remember to have a brief but comprehensive cover letter: people are unlikely to return the forms unless they both understand the purpose of the survey and perceive it as useful. So, in addition to describing the study in the cover letter, you need to explain what uses it will serve, and specifically whether it will benefit the subject in terms of improved information services.

Increasingly, you may be able to use e-mail to distribute questionnaires, where your organisation uses lists of all staff, or sub-lists, to distribute corporate information; but do make sure that the IT department feels that such use of these lists is legitimate. Alternatively, you may be able to post the questionnaire on the organisation’s website, with an e-mail pointing users to an on-line form to complete and submit electronically. Beware that such “technological” solutions may specifically exclude more “Luddite” service users and deny them a voice.

 

Sometimes, you can combine techniques. A postal survey elicits numeric detail; you could also ask whether respondents would be prepared to be followed up with a fuller interview. From those who are willing to be interviewed, select those who give original, unexpected or atypical answers, to explore particular issues. In both cases, questions must be “neutral”, i.e. not phrased such that the subject feels that a specific reply is indicated, preferred or somehow “right.”

 

Low response rates may reduce the validity of any survey: since non-respondents may be quite different from respondents (different age group, different educational level, etc.), it is very important to locate at least a few of the “did not return” group and try to find out something about them.

 
Data quality and sampling

 

All methods raise questions of accuracy: people may have fallible memories. Alternatively, they may “censor” their answers to provide more acceptable ones, not wishing to show the organisation or its staff in a bad light. They may not wish to identify specific individuals responsible for defects in the service. If resources permit, triangulation (using data from more than one source to confirm findings) may help to increase accuracy. Whichever methods you use, you need to “benchmark” the quality of your data on two scales:

 

Reliability: are the answers you get consistent? If you repeated the process a week later, would your respondents give you the same answers?

Validity: are you measuring what you think you are measuring?

 

Also, it is highly unlikely that you will have resources to survey all your users, even if they agreed. So you must decide how many subjects you need and how you will select a suitable sample. Ideally, the sample should reflect all users, so that you can generalise your findings to the wider population, in drawing conclusions and making recommendations. For most studies this means some form of random sampling to minimise bias. But, if you have particularly unusual or hard-to-reach user groups, or users who are very significant in their uptake of information services but numerically small, you may need to employ purposive sampling, to make sure you include their views.

Similarly, in setting up a focus group, the key to success may be to make sure that you cover all possible user perspectives. Another option is quota sampling, where you aim to select a certain number of users from each group. So, once you have twenty undergraduates, you stop recruiting from that group, and shift focus to twenty postgraduates. Convenience sampling, via personal networks and contacts, introduces more bias, but is useful for contacting “hard to reach” groups or non-users. Sample size is less straightforward than in a clinical trial and in many cases, you should continue to recruit subjects until saturation is reached (see below). In practice, resources like time and money determine how many interviews or questionnaires you can afford to collect. Assuming that you do have the luxury of being able to sample randomly, techniques like random number tables can be useful. The more heterogeneous (varied) the population, the larger the sample size needs to be. Alternatively, you must use stratified sampling to make sure you include representative members of all the sub-populations.
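The sampling strategies above can be illustrated with a short sketch. The Python code below (the user register, names and group labels are all invented for the example) contrasts simple random sampling with stratified sampling across sub-populations:

```python
import random

random.seed(0)  # fixed seed so the example is reproducible

# Hypothetical user register: (name, group). Names and groups are invented.
users = [(f"user{i}", random.choice(["undergrad", "postgrad", "staff"]))
         for i in range(200)]

# Simple random sampling: every user has an equal chance of selection.
simple = random.sample(users, 20)

# Stratified sampling: sample within each sub-population, so that small
# but important groups are still represented in the final sample.
def stratified_sample(population, key, fraction):
    strata = {}
    for item in population:
        strata.setdefault(key(item), []).append(item)
    sample = []
    for members in strata.values():
        k = max(1, round(len(members) * fraction))  # at least one per stratum
        sample.extend(random.sample(members, k))
    return sample

strat = stratified_sample(users, key=lambda user: user[1], fraction=0.1)
print(len(simple), len(strat))
```

A simple random sample may, by chance, miss a numerically small group entirely; the stratified version guarantees at least one member from each stratum, which mirrors the purposive reasoning described above.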

Once the study has commenced, you can start to develop suitable databases and coding frames to hold the information. For qualitative data, you can code responses to produce themes. Data saturation (i.e. the phenomenon that further interviews or questionnaires do not generate new themes, and new subjects stop adding new insights but merely produce more examples of themes you have already identified) suggests that you have obtained all user perspectives.
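Data saturation can be monitored quite mechanically. The sketch below (Python; the theme labels are invented) records the themes coded in each successive interview and flags the point at which a run of interviews adds nothing new:

```python
# Themes coded from each successive interview, in order.
# Theme labels are invented for illustration.
coded_interviews = [
    {"opening hours", "e-journals"},
    {"opening hours", "training"},
    {"e-journals", "photocopier"},
    {"training", "opening hours"},
    {"photocopier"},
    {"training"},
]

def saturation_point(interviews, window=2):
    """Return the index of the interview at which `window` successive
    interviews have added no new themes, or None if never reached."""
    seen = set()
    run = 0
    for index, themes in enumerate(interviews):
        new = themes - seen
        seen |= themes
        run = 0 if new else run + 1
        if run >= window:
            return index
    return None

print(saturation_point(coded_interviews))  # → 4
```

Here interviews 4 and 5 (counting from 0) repeat earlier themes, so recruitment could reasonably stop; in practice you would choose the window size to suit your own tolerance for missed perspectives.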

 

 

Recommended reading:

 

  • Booth A (2003) A quest for questionnaires. Health Information and Libraries Journal 20(1): 53-56.

  • Bookstein A (1985) Questionnaire research in a library setting. Journal of Academic Librarianship 11(1): 24-28.

  • Fowler F (1995) Improving Survey Questions: design and evaluation. Thousand Oaks, California: Sage.

  • Kalton G (1983) Introduction to Survey Sampling. Newbury Park: Sage.

  • Powell R (1997) Basic Research Methods for Librarians. Greenwich, Connecticut: Ablex.

  • Schuman H, Presser S (1996) Questions and Answers in Attitude Surveys: experiments on question form, wording and context. Thousand Oaks, California: Sage.

  • Jorgensen D (1989) Participant Observation: a methodology for human studies. Thousand Oaks, California: Sage.

  • Stone DH (1993) How to do it: Design a questionnaire. British Medical Journal 307: 1264-1266.

 

 

 

     
