| 

FOLIO

Facilitated Online Learning as an Interactive Opportunity

 

 


 

 

Planning and Conducting Information Needs Analysis (PACINA)

 

Briefing 6

 February 2005

 

Analysing and applying the results of the audit

 

Introduction

Now we come to what many regard as the key phases of the audit: analysing the data, identifying emerging themes, drawing conclusions, and developing and implementing recommendations.

 

Organising the data

This may involve: input into suitable databases; devising a data analysis plan; and transcription, coding, correcting and editing of data, preparatory to formal analysis and mapping. At an early stage of analysis, or even earlier as you await your data, you will need to make pragmatic decisions about how to process it, around issues such as:

 

  • How to code for missing, inconsistent or obviously erroneous data.
  • Data coding formats and frameworks, which will need to accommodate missing or ambiguous data. Closed questions and pre-coded scales (e.g. Likert) will be easy to put into categories; for open-ended questions and unstructured interviews, you may have to adopt an iterative process of developing new categories as you enter the data, until all options are provided for.
  • Whether, and if so how, you will convert textual comments to numerical codes or scales (never = 1, sometimes = 2, often = 3, always = 4, etc.); a sketch of this follows the list. Closed questions, where you force respondents to choose one from a set of mutually exclusive responses, are easy to code: in fact, the questionnaire can include a built-in coding frame. Open-ended questions require the identification and coding of concepts, with keys for synonymous terms.
  • What tools you will use (spreadsheets, formal statistical packages, qualitative software like NUD*IST).
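
A minimal sketch of the scale-coding decision above, in Python; the scale values, the missing-data sentinel and the sample answers are illustrative choices rather than a standard:

```python
# Convert textual frequency responses to numerical codes, with an explicit
# code for missing or unusable answers (values here are illustrative).
FREQUENCY_SCALE = {"never": 1, "sometimes": 2, "often": 3, "always": 4}
MISSING = -9

def code_response(answer):
    """Map a free-text answer onto the ordinal scale, flagging anything else."""
    if answer is None:
        return MISSING
    return FREQUENCY_SCALE.get(answer.strip().lower(), MISSING)

raw_answers = ["Often", "never", "", None, "Always", "don't know"]
print([code_response(a) for a in raw_answers])  # [3, 1, -9, -9, 4, -9]
```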

 

At this stage you may also assess the data for quality, and review issues like validity and reliability.

 

Data analysis usually encompasses one or more of four stages:

 

Developing a “feel” for the data: the important thing here is to avoid bias, by dropping any prejudices or preconceived ideas; otherwise there is a risk that selective use of the data will merely “prove what we already know.” Work here includes calculating the response rates, including rates for sub-groups and the “hard to reach.”

 

Running statistical analyses: analyses may be either simple descriptive techniques or analytic (see below).

 

Running narrative analyses: you will use this for textual material, such as completed questionnaires or transcripts of interviews. The concepts in the subjects’ own words may need coding: for example, “I think we need more specialised journals” and “I often have to put in ILLs for journals” may both be coded as “shortfalls in current journal subscription policy.” This is where familiarity with the data is most useful, as you may need to keep devising new categories until all responses have been listed and classified. Each category, or code, will need a definition (both in terms of exclusion and inclusion) to allow the researcher, or even more importantly a team of researchers, to code interview data consistently, and the final codes may produce a hierarchy:

            Comments about library and information services

                        Comments about textbooks

                        Comments about journal holdings

                        Comments about databases

                                    Comments about CINAHL

                                    Comments about MEDLINE

                                                Access to MEDLINE

                                                Training in using MEDLINE

 

You may be able to afford the luxury of having two independent analysts code all data, and then compare their results to check for consistency, with a third as adjudicator where they cannot reach an agreement.
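
If you do have two coders, a simple consistency check can be automated. The sketch below (response identifiers and code labels invented for illustration) reports percent agreement and lists the responses to pass to an adjudicator:

```python
# Compare two independent coders' results and flag disagreements.
coder_a = {"R1": "journal_shortfall", "R2": "training_need", "R3": "access_problem"}
coder_b = {"R1": "journal_shortfall", "R2": "access_problem", "R3": "access_problem"}

shared = sorted(set(coder_a) & set(coder_b))
agreed = [rid for rid in shared if coder_a[rid] == coder_b[rid]]
disputed = [rid for rid in shared if coder_a[rid] != coder_b[rid]]

print(f"Percent agreement: {100 * len(agreed) / len(shared):.0f}%")  # 67%
print("Send to adjudicator:", disputed)                              # ['R2']
```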

 

Drawing conclusions and making recommendations: this involves drawing the many responses together to see if patterns or common threads emerge, and if these indicate problems which require specific actions. It may involve two linked but distinct processes:

 

  • Evaluation: expressing the collected data in relation to the known or familiar and assessing the “value” of what the data shows. Issues here may include the tendency of some departments or individuals to hoard information; attempts to gain influence by refusing to share data; biased distribution of resources; use of out-dated or poor quality information; defects in information provision; duplication of information provision; lack of trails to trace information sources; and failure to map information provision to user needs.
  • Interpretation: explaining what the data shows, drawing conclusions about the implications and assessing its organisational significance, including the identification of any gaps or problems, and their analysis in terms of resources, solutions and strategic significance.

 

At the same time, a few loose ends need tidying up, such as archiving data and destroying any confidential material which you see no further use for, or fully anonymising it if you can see some future research potential.

 

Descriptive and analytical statistics

Basically, statistical analysis falls into two categories:

 

Descriptive: simply using the data to describe what is happening. Typically this involves presentation of the raw data as tables, graphs or charts, such as histograms of book borrowing for each month of the academic year or a pie-chart of the age groups of those using your service. Most data is amenable to such presentation, although some may require classification: e.g. a free-text question about satisfaction with information services may require that the answers are classed as “unsatisfied,” “generally satisfied,” “satisfied” or “very satisfied.” Such presentation makes data easier to “eye-ball,” and you can make simple observations from it: “65% of our users are aged 44-65.” But it does not allow you to make inferences or develop ideas about correlation or causation.
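
As an illustration of the descriptive approach, the short Python sketch below (with invented responses) turns coded satisfaction answers into a frequency table with percentages, ready to present as a table or chart:

```python
# Build a simple frequency table from coded free-text satisfaction answers.
from collections import Counter

responses = ["satisfied", "generally satisfied", "unsatisfied", "satisfied",
             "very satisfied", "generally satisfied", "satisfied"]

counts = Counter(responses)
total = len(responses)
for category, n in counts.most_common():
    print(f"{category:<20} {n:>3}  ({100 * n / total:.0f}%)")
```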

 

Analytical: here various mathematical techniques can be used to try to “explain” the data, by establishing correlations between variables; comparing rates between groups to see if they suggest genuine differences between those groups or just represent random chance; and calculating the probability of alternative hypotheses.

To perform analytical tests, you need to translate your data into some sort of numeric score, and for textual data (such as verbal answers to questionnaires or responses in interviews) that will need some sort of coding. Simple descriptive statistics will usually suffice for most Information Needs Analyses. If you do wish to apply statistical tests, there are now numerous commercial software packages (e.g. SPSS) providing the technology, although the advice of a statistician is highly recommended to make sure you use the right tests.
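
As a hedged illustration of an analytical test, the sketch below uses Python's SciPy library (a freely available alternative to the commercial packages mentioned above) to run a chi-square test on a small, invented contingency table, asking whether two groups of users really differ in their use of a service:

```python
# Chi-square test of independence on a 2x2 table of invented counts.
from scipy.stats import chi2_contingency

#                used service   did not use
table = [[40, 10],   # e.g. clinical staff
         [25, 25]]   # e.g. administrative staff

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, p = {p_value:.3f}")
# A small p-value suggests the difference is unlikely to be random chance;
# choosing and interpreting the right test is still best done with a statistician.
```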

Most numeric data can also be defined by two characteristics:

 

Range or distribution: the “spread” of data from highest to lowest value, e.g. of age or income. The range can be distorted by a very few outliers. So, for instance, in a class of 30 children, twenty-nine could be 4’ 6” to 5’ 1” tall, but one child with an unusual hormonal condition might already be 6’ 2”. The range of heights would then be 4’ 6” – 6’ 2”, but there would be no children in the 5’ 2” – 6’ 1” range. So, distribution is often converted into calculated measures like the standard deviation (which takes into account the number of individual readings and the distance each lies from the centre) or confidence intervals.

 

Measure of central tendency: an idea of where the “mid-point” of the data lies. There are three mathematical terms commonly used for this (a short worked sketch follows the list):

  • Mean: strictly arithmetic mean, the commonest, often more loosely referred to as “the average.” Add all numerical values together and divide by the number of items (hence the average family size in England is 2.4 children)
  • Median: if all the data were arranged in order of size, and you counted along until you came to the middle one (the fifth from either end of the scale for nine individuals), that would be the median. Effectively it divides the sample into two, with the same number of individuals above and below the median. If you have an even number of samples, the convention is to take the arithmetic mean of the middle two: so for eight readings, take the mean of the fourth and fifth.
  • Mode: the most commonly occurring value.
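
A short worked sketch of these measures, using Python's standard statistics module on invented monthly book-loan counts; note how one very avid reader pulls the mean and the range away from the median:

```python
import statistics

loans = [2, 3, 3, 4, 5, 5, 5, 6, 40]  # one outlier distorts the picture

print("mean   =", round(statistics.mean(loans), 1))  # 8.1, pulled up by the outlier
print("median =", statistics.median(loans))          # 5, the middle value
print("mode   =", statistics.mode(loans))            # 5, the most common value
print("range  =", max(loans) - min(loans))           # 38, dominated by one reader
print("stdev  =", round(statistics.stdev(loans), 1)) # sample standard deviation
```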

 

There is a specific symmetrical distribution in biology (the Normal curve, for parameters like height) where these three are all the same, but in other distributions they may not be. Classically, mean income is much higher than median income, because of the distortion caused by a small number of very large salaries. Similarly, the figures for average book borrowing may be distorted by a small number of very avid readers among a much larger population.

 

Analysis of qualitative data

If you have conducted interviews, or used questionnaires which collect free text, you may have a large volume of qualitative data. The object of qualitative analysis is to develop concepts which help us to understand social phenomena as they occur in natural rather than experimental situations. The analysis of qualitative data involves more than just “reading it and pulling out a few illustrative quotes.”

There are even specially developed computer programmes to help with the process (such as NUD*IST and Atlas-ti), although as inputting text into such packages can be very time-consuming, most workers would only use them for very large data sets. The following techniques can be used on qualitative data:

 

Grounded theory: the researchers collect the data with a completely open mind and no preconceptions. As they work through the data, they generate theories to explain it. Ideally, this should reduce bias and value the opinions of the respondents, while identifying meaning from their perspective. It is especially good where there are several cycles of data collection. However, it is very hard to come to a set of data with your mind a complete tabula rasa: invariably, your opinions, background and any literature searching you have done will mean you approach the data with some “conceptual baggage.”

 

Content analysis: the systematic analysis of text by identifying and grouping themes, and then coding and developing categories of responses. In iterative content analysis, as each of these categories emerges, the researcher reviews all text to see if he can identify further examples.
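
A minimal sketch of that iterative coding step in Python; the theme names, keyword lists and comments are all invented for illustration, and anything left uncoded becomes a candidate for a new category on the next pass:

```python
# Group free-text comments under themes by keyword; keep unmatched comments
# aside so new categories can be created for them.
themes = {
    "journal_holdings": ["journal", "subscription", "ill"],
    "training": ["training", "course", "how to use"],
}

comments = [
    "We need more specialised journals",
    "I often have to put in ILLs for journals",
    "Nobody showed me how to use MEDLINE",
    "The opening hours are too short",
]

coded = {theme: [] for theme in themes}
uncoded = []
for comment in comments:
    text = comment.lower()
    matched = [t for t, kws in themes.items() if any(kw in text for kw in kws)]
    (coded[matched[0]] if matched else uncoded).append(comment)

print(coded)    # journal comments and the MEDLINE comment are coded
print(uncoded)  # the opening-hours comment needs a new category
```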

 

Analytic induction: use of constant comparison, specifically in developing  hypotheses which can then be tested in further data collections.

 

Ethnography: the researcher acts as observer of the subjects, either as passive non-participant or as an active participant in the area under study. Usually this is applied to fairly long term sociological or anthropological  studies, where the researcher goes to live in a community he wishes to study, and where there is time for him to “bed in,” stop being an outsider and even gain the confidence of key informants in the community. There is however always the risk that the observer may become so integrated into the community that he “goes native.”

 

Framework Approach: this has been developed for qualitative research where the objectives of the investigation are set in advance and shaped by the information requirements of the funding body, and the timescale is short. It starts deductively from pre-set aims and objectives. The data collection tends to be more structured than would be the norm for much other qualitative research, and the analytical process tends to be more explicit and more strongly informed by a priori reasoning. This is achieved using five stages (Ritchie and Spencer, 1993; Fielding, 1993):

 

  • Familiarisation: immersion in the raw data to list key ideas and recurrent themes.
  • Identifying a thematic framework of all the key issues, concepts and themes.
  • Indexing: applying the thematic framework systematically to all the data by annotating the transcripts with numerical codes from the index (a sketch of this and the charting stage follows this list).
  • Charting: rearranging the data according to the appropriate part of the thematic framework to which they relate.
  • Mapping and interpretation: using the charts to define concepts, map the range and nature of phenomena, create typologies and find associations between themes with a view to providing explanations for the findings.
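
A minimal sketch of the indexing and charting stages (the thematic index, interviewees and transcript fragments are invented): each transcript segment is annotated with a code from the index, then the coded data is rearranged so that everything on one theme sits together:

```python
# Index transcript segments with codes from a thematic framework, then chart
# (rearrange) the data by theme.
from collections import defaultdict

index = {"1.1": "access to databases", "1.2": "training needs", "2.1": "journal holdings"}

indexed_segments = [
    ("Interviewee A", "1.1", "I can't get to MEDLINE from the ward"),
    ("Interviewee A", "2.1", "The journal we need isn't taken"),
    ("Interviewee B", "1.2", "I'd like a refresher session on searching"),
    ("Interviewee B", "1.1", "Passwords keep expiring"),
]

chart = defaultdict(list)
for who, code, text in indexed_segments:
    chart[index[code]].append((who, text))

for theme, entries in chart.items():
    print(theme, "->", entries)
```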

 

Triangulation: bringing in data from another source; using two researchers (e.g. having an observer as well as a facilitator in a focus group).

 

Reporting the audit……

The end product of data analysis should be a summary of the audit findings, and specifically answers to the questions you posed when planning the audit. This involves drawing the many responses together to see if patterns or common threads emerge, and if these indicate problems which require specific actions. Since “the job is never complete until the paper work is done,” the output must be a report, which concisely but accurately describes the audit and its findings. The report provides a final archive of the whole project, although the only people who need a copy may be the audit team and key stakeholders who will decide about resource issues or implementing action plans. There are numerous ways of writing a report, but you should include the following sections:

 

  • Introduction: setting the scene, background information, the rationale for the audit, and the questions or problems it sets out to address. If you have conducted a literature review, there should also be a section summarising it and placing the audit in the context of what is already known.
  • Methods: a description of the techniques used in the audit and data collection tools.
  • Results: a summary of the data. Raw data is often of limited use to most of the report’s potential readers, so it should be omitted or placed in an appendix, or you can add a footnote saying “available from….on request.” Your report should however include a summary of the data, using descriptive statistical techniques (tables, graphs etc), and possibly some textual quotes.
  • Discussion: you may wish to raise any issues about the accuracy of the data, specific problems encountered and solutions employed, and how your findings agree with or question the literature review.
  • Conclusion: the findings of the audit, presented factually rather than in a speculative style.
  • Recommendations: how the findings of the audit may be translated into practice, or an action plan. This may include a timetable for  dissemination, monitoring and re-audit (see below).
  • References and citations: of books, journals, websites and reports consulted.
  • Acknowledgements: for help received.

 

Another potential output from an information audit is to contribute to the development of an information policy, including aspects such as:

 

  • A mission statement, summarising the purpose, values and objectives of the organisation
  • Information policy statement, providing clear guidelines for effective information management within the organisation
  • Objectives of the information policy, ideally with some criteria for monitoring how effective they are.

 

…..making recommendations…

Although these should feature in the report, they will be the last part of it to emerge, after data analysis. Here you must be realistic, steering between what the various stakeholders would like in an ideal world, and what you can provide within the limits of the organisational resources. You may also need to think ahead to the wider implications of any such action plans on other services: a re-structuring of the service may produce both winners and losers. In producing an action plan, you may need to go back to your users and ensure that they have a voice in deciding how the study results will modify service provision. For each problem, the action plan may need to include:

 

  • Identification of the problem (including the implications of not “fixing” it but continuing with the status quo).
  • Identification of a solution, with a detailed analysis of what is needed to deliver it.
  • Plan for implementation
  • Costings and timetable

 

…..and implementing them

So far, you could say the assessment has been a paper exercise, collecting, analysing and reporting on user needs. Now (deep breath!) we come to the bit that really matters, the very reason why you embarked on this project: what are you going to do with the results and how will you use the conclusions to inform practice? There may however at this stage be some “handing on the baton” to another team who will implement your recommendations, especially if the assessment was commissioned by a management group, to help inform a strategy they wished to design.

 

The audit has progressed as far as a report, albeit one which may include a detailed action plan. The risk is that it will not get beyond that: “Just put it on the shelf with all the other reports.” Implementation involves moving from the current situation to a more desirable one: from the theory of the conclusions in the action plan to the practicalities of achieving them; from where the organisation is to where it needs to be. You may recall that we mentioned a key phase in planning the audit was gaining the ear of significant senior staff, or “sponsors,” and at this stage their continued support becomes crucial. Sometimes, implementation may not be problematic; at other times it may require resource allocations, involve long-term policy changes, have significant implications across the organisation, or any combination of these. Delivering this may require expertise in other disciplines such as change management. General principles for implementation, however, include the following:

 

  • Clearly state the goals, so that all involved know why change is proposed and what the anticipated benefits are.
  • Clearly state the process by which the organisation will achieve it, including timescales and costs.
  • Understand who the changes will affect, and the stages staff may go through during the process: shock, denial, depression, optimism, acceptance.
  • Relate the proposed changes to organisational culture.
  • Clarify managerial expectations and support at each stage of the implementation process.
  • Communicate clearly before, during and after the implementation.
  • Anticipate any difficulties (and if possible devise solutions) and make sure there are pro-active key players, who can act as “agents of change.”

 

Sharing the results

Assuming that the end product of the assessment will be a set of recommendations or an action plan, you will need to communicate your findings to your stakeholders, otherwise you run the risk of a perfect audit, the report of which simply gathers dust, rather than achieving any useful change. At this stage you may need to tailor the way you present the conclusions, into different forms for users, staff and management, and another style altogether if you have plans to publish, depending on what your target journals are. Nowadays, there are many ways to publicise the findings:

 

  • A formal report (see above)
  • An oral presentation or seminar, either as a one off lunch time event, or as part of a larger programme, such as the next away-day. Generally, this does not need the level of detail of the written report, but you need to anticipate and prepare for questions. There may also be time for discussion and clarification. Illustrations (especially graphs) can be very useful, but must be linked to the spoken presentation, and you may be able to take advantage of specific technologies, like PowerPoint.
  • A short report or “executive summary” of the key themes and conclusions for wider dissemination
  • A poster for display on public notice-boards
  • A short section in the next organisational newsletter
  • A page on the organisation’s website.
  • Personal feedback or de-briefing sessions to key stake-holders and survey participants.
  • You may consider writing up the audit and submitting it for publication in a journal or a conference presentation, if you feel that it demonstrates generalisable principles, which other workers in the same area could apply.

 

Information audit as a continuum

It is easy to see the audit as a “one off,” a chore which staff must complete once, or maybe once a year, like spring cleaning or stock taking. But increasingly it is becoming a continuous process, monitoring a changing environment and ensuring that information provision meshes as well as possible with users’ information needs, especially in rapidly changing fields, where those needs are constantly evolving. Where audits do lead to the implementation of policy changes, you also need methods to evaluate the impact of those changes. For major innovations, the cycle of monitoring and feedback may be long: it may be up to twelve months before it is realistic to measure impact. In many cases, you will need to set up an audit cycle with periodic re-audits to assess impact. This serves two purposes:

 

  • You can see whether the action plan has delivered the anticipated improvements, or whether further changes and “fine tuning” are required.
  • It can tell you if the environment is changing and a new action plan is needed.

 

References:

  • Ritchie J, Spencer L (1993) Qualitative data analysis for applied policy research. In: Bryman A, Burgess R, eds. Analysing qualitative data. London: Routledge: 173-194.
  • Fielding N (1993) Ethnography. In: Fielding N, ed. Researching Social Life. London: Sage: 155-171.

 

Recommended reading:

  • Bryman A, Burgess R (1993) Analysing Qualitative Data. London: Routledge.
  • Dubois CPR (1995) The information audit: its contribution to decision making. Library Management 16(7): 20-4.
  • Henczel S (2001) The Information Audit: a practical guide. München: K.G. Saur.
  • Miles M, Huberman A (1984) Qualitative Data Analysis. London: Sage.
  • Pope C, Ziebland S, Mays N (2000) Qualitative research in health care: Analysing qualitative data. British Medical Journal 320: 114-116. Online at: http://bmj.bmjjournals.com/cgi/content/full/320/7227/114
  • Powell R (1997) Basic Research Methods for Librarians. Greenwich, Connecticut: Ablex.
  • Westbrook L (2001) Identifying and analyzing user needs: a complete handbook and ready-to-use assessment workbook with disk. New York: Neal-Schuman Publishers.
  • Worlock D (1987) Implementing the information audit. Aslib Proceedings 39(9): 255-60.

 

 

     
