vol. 15 no. 3, September, 2010

Proceedings of the Seventh International Conference on Conceptions
of Library and Information Science—"Unity in diversity"

Systematic reviewing, meta-analysis and meta-synthesis for evidence-based library and information science


Christine Urquhart
Department of Information Studies, Aberystwyth University, SY23 3AS, UK


Abstract
Introduction. Systematic reviews are common in the health sector, where meta-analysis is used to integrate quantitative findings. For qualitative research, and the types of research common in library and information science, meta-synthesis methods might be appropriate. The aim is to examine how various meta-synthesis methods might integrate information behaviour and information literacy research.
Method. Literature review, with discussion based around two systematic reviews of information literacy.
Analysis. Overview of trends in information behaviour and information literacy. This is mapped to meta-synthesis methods.
Results. Information behaviour research follows several research paradigms. Within information literacy there is increasing interest in information literacy within a community of practice, rather than as a set of individual skills and knowledge. Realist synthesis approaches or critical interpretive synthesis could be used to integrate information behaviour studies with information literacy research.
Conclusions. Meta-synthesis requires careful attention to the questions to be asked, and the research paradigms used in the studies to be integrated.


Introduction

Systematic reviews shot to prominence in the health sector with the development of evidence-based medicine and the establishment of the international Cochrane Collaboration, which produces systematic reviews to ensure that up-to-date information about the effects of healthcare interventions is available throughout the world. Systematic reviews, as produced for the Cochrane Collaboration, differ from the traditional narrative literature review in providing more transparent accounts of the searching process and of the decisions about what should be included in and excluded from the review, with clear criteria for quality assessment of the research studies discussed.

With the immediate attractions for governments of citing the 'evidence base' to support their policy decisions, systematic reviewing has become a growth industry, with many reviews aimed at policymakers or practitioners purporting to be systematic reviews of the literature. Interest in evidence-based practice has spread from the clinical disciplines to education and welfare. This has led, for the health journals at least, to guidelines on what should be expected of articles that claim to be systematic reviews (Moher et al. 2009). This is important as some systematic reviews will contain meta-analyses of the quantitative data, to come to a decision on whether an intervention is effective or not.

The systematic review industry has provided opportunities for library and information professionals in supporting the production of systematic reviews through devising search strategies and search filters, and managing the collection of references for the reviews (Beverley et al. 2003). Many information professionals who started out supporting systematic reviews for clinicians have transferred their skills and knowledge to their own subject discipline. There have been developments too, with the biennial Evidence Based Library and Information Practice series of conferences, the fifth of which was held in Stockholm in 2009. The health sector was probably the most prominent grouping at the conference, unsurprisingly, but there was strong representation from libraries in higher (and further) education, and some representatives from schools and public libraries. There is now a journal, Evidence Based Library and Information Practice, which includes evidence summaries that are intended to critique and summarise research studies for practitioners. Within the general literature, changes such as the introduction of structured abstracts by Emerald and other publishers are intended to make it easier to identify articles of interest, and perhaps to encourage more honest reporting by researchers of the main aspects of their research.

One of the main difficulties in transferring the clinical systematic review model to library and information science is the different knowledge base. The health sector ranks randomised controlled trials as the gold standard study design, and such designs are rare in library and information science. Some Cochrane Collaboration groups are more inclusive in the study designs that may be included in their reviews; the Effective Practice and Organisation of Care Group includes controlled before-and-after studies as well as interrupted time series. Controlled before-and-after studies are feasible for assessing many information literacy interventions, and interrupted time series designs might be appropriate for some evaluations of information systems implementations. Sadly, the usual problem for the latter is a lack of measurement points prior to the implementation.

The type of evidence that is prevalent in library and information science is the observational survey, often a mixture of quantitative and qualitative data, or ethnographic research with qualitative data drawn from various sources such as interviews, observation in fieldwork and focus groups. Traditional library and information science reviews, such as the Annual Review of Information Science and Technology, are generally narrative literature reviews, with some elements of mapping reviews (as defined in the typology by Grant and Booth (2009)). The organization of the review is often intended to provide a useful state-of-the-art review, with brief details of the relevant research studies, organized perhaps by study design, theme or chronology. Such reviews may help to identify future research needs, or contain some lessons for practitioners, but they are generally very even-handed in their treatment of individual studies, and it is often difficult to assess whether one study is more credible than another.

My own interest in meta-synthesis problems dates from a time when I tried, unsuccessfully, to make sense of a mass of different studies on nurses' information seeking. Part of the problem was the difficulty of comparing findings from studies that had used slightly different questions, and the lack of validated research instruments is still a problem in library and information science. Use of standard questionnaires would make it easier to make valid comparisons across observational surveys. There are signs of progress here, but the first problem is often persuading novice researchers, such as postgraduate students, that it is not only acceptable but desirable to replicate.

A deeper problem arises from the different standpoints taken by researchers. The literature on the use of electronic information services could be divided quite neatly into two groups: that of the researcher looking at information behaviour and that of library management examining use of resources (Rowley and Urquhart 2007). Those interested in information literacy (usually practitioners) and those interested in information behaviour (usually academic researchers) need to be aware of the research done by the other group but, as Limberg and Sundin (2006) note, the interactions between the two groups of researchers and developers have been limited. This is partly a problem of asking different questions, but the research perspectives are also likely to be different.

Bates contrasts the nomothetic approach to research, which tries to establish the general laws underlying something, with the idiographic approach, concerned with the individual, which 'cherishes the particulars' (Bates 2005: 9). Similarly, Fuller (2003: 430) contrasts two strategies for generating philosophically interesting problems of knowledge: 1) generalising from the individual case, adding insight (the scientific approach); or 2) fully realising the universal, redistributing something already present such as knowledge or power. The nomothetic approach is the scientific, positivistic approach, appropriate to determining where, when and how resources are being used. The idiographic approach is concerned with explanation, answering the 'why' questions, and is associated with the constructivist approach, acknowledging that not everyone will share the same ideas about what is happening, but that there can be some consensus.

Trying to integrate findings from studies done from different standpoints is difficult – here be dragons of different viewpoints of what counts as valid knowledge, as well as the practical problems of trying to convert qualitative 'important' findings into something that can be melded with quantitative data. Nevertheless, policymakers in governments want answers to questions about social policy, and social scientists themselves need to assess their research priorities. There is a demand for transparent methods of synthesising the findings of qualitative research studies, and of qualitative and quantitative research together. In the social sciences, synthesis is a better descriptor of the process for qualitative research findings, and the term meta-synthesis is used to distinguish this from quantitative meta-analysis. Meta-synthesis may also be used to integrate the findings from quantitative and qualitative studies. Unsurprisingly, there are many approaches to meta-synthesis, and a review of meta-synthesis methods for qualitative research (Barnett-Page and Thomas 2009) lists around ten methods. For meta-synthesis of qualitative and quantitative research there are several methods, including some that work for purely qualitative research as well (Mays et al. 2005). For meta-synthesis aimed at policymakers, realist synthesis provides a framework for answering questions about what works for whom, when and how (Pawson 2002; Pawson et al. 2005).

Aims and objectives

The aim of the paper is to examine some meta-synthesis methodologies that might be relevant to researchers in information behaviour and practitioners in information literacy. Research strategies are considered, and two systematic reviews on information literacy are taken as the basis for discussion of meta-synthesis methods. The paper concludes with some recommendations for integrating the research findings of researchers, typically in information behaviour or e-learning research, and of practitioners who are organizing and evaluating information literacy programmes.

Methods

The approach is based on literature review. A companion paper (Urquhart 2011) discusses the identification of trends in research strategies used in information behaviour research, and the findings are summarised for this paper. The companion paper also identified major reviews of meta-synthesis methods published to 2009.

In addition to the two systematic reviews identified for information literacy (in the health and academic sectors), a further search of the literature was made to find other overviews or reviews around information literacy, to reveal any other factors that might influence how meta-synthesis might usefully proceed. The two examples of systematic reviews are discussed, with emphasis on the gaps in evidence revealed. Various approaches to meta-synthesis are then discussed, to illustrate how meta-synthesis might help researchers, practitioners and policymakers.

Findings: information behaviour

In the companion paper (Urquhart 2011), reviews of information behaviour research were examined to assess which research inquiry paradigms were discussed. The categories used were those provided by Guba and Lincoln (2008: 260-261) and those used by Barnett-Page and Thomas (2009) in their review of qualitative methods of meta-synthesis. As the companion paper acknowledges, finding agreed definitions is difficult, and conflating two typologies risky. The attempt was made to help identify suitable approaches for meta-synthesis of information behaviour research studies.

Several of the reviews attempted to cover a wide range of information behaviour research studies, and inevitably a range of inquiry paradigms, often not explicitly stated. The findings indicated a continuing strong interest in psychological contributions to information behaviour research, although this is moving from cognitive psychology (traditional positivist/post-positivist) to evolutionary psychology and ecological psychology. Spink and Cole (2006), for example, propose the integration of evolutionary and social, spatial and collaborative, and multitasking frameworks. The other strong theme is constructivism. A few reviews (Hepworth 2007; Mutshewa 2007) suggest that aspects of power, and wider cultural and societal influences, are rarely considered in information behaviour research. Some researchers propose participatory approaches, with emphasis on seeing the information user within a network with other colleagues, peers or family, or as part of a community of practice.

Meta-synthesis methods

One major comprehensive review (Barnett-Page and Thomas 2009) covered a range of meta-synthesis methods for qualitative research, including meta-ethnography, critical interpretive synthesis, grounded theory approaches, meta-narrative and thematic synthesis. Some, but not all, of these methods could be applied to synthesis of qualitative and quantitative research studies on the same topic. Other reviews, e.g., Mays et al. (2005), specifically discussed methods for the integration of qualitative and quantitative data. Some of the reviews that deal with the integration of qualitative and quantitative research data emphasise the importance of the questions asked by policymakers. Realist synthesis embraces a programme theory of change, asking questions such as which programmes work for whom and in what circumstances, but keeps open the idea that there may not be a consensus result from an aggregation of the evidence. Realist synthesis operates within a realist paradigm that accepts the possibility of causal mechanisms, but there is an emphasis on explanation, and this approach has been proposed for evidence-based policymaking (Pawson 2002; Pawson et al. 2005).

With meta-analysis of quantitative evidence, it is assumed that, with the same body of evidence and looking at the same outcomes, meta-analysis should come up with the same, or very similar, quantitative conclusions even if slightly different statistical methods are used. This may not hold for qualitative, or quantitative-qualitative, meta-syntheses, for several reasons. First, there are different ways of integrating quantitative and qualitative evidence (Dixon-Woods et al. 2006), and the method used may affect the results obtained. Second, qualitative meta-synthesis is much fuzzier on the benefits of appraising research studies prior to synthesis. Approaches that are based loosely or firmly on grounded theory may prefer a more iterative approach, and studies that are otherwise 'unworthy' may provide useful evidence in later stages of the synthesis – a theoretical sampling approach (Dixon-Woods et al. 2006). Third, ambiguous evidence is difficult to handle – the concept of confidence levels in quantitative analysis has to be translated into a different language of risk, and the complications of considering the generalisability or transferability of the findings mean that different conclusions may be drawn for policymakers. Boaz and Pawson (2006) note that five reviews of a politically contentious issue came to different conclusions.

Barnett-Page and Thomas (2009) also note that proponents of one method or the other do not necessarily cite each other, even if there are clear conceptual links. This makes it difficult for newcomers to meta-synthesis to distinguish the true differences and similarities between the methods.

Information literacy reviews

There have been systematic reviews of information literacy training and support: for health libraries, Brettle (2007) examined the reliability and validity of the measures used; Koufogiannakis and Wiebe (2006) compared different teaching methods for academic libraries; and Zhang et al. (2007) compared traditional and computer-aided instruction. The conceptualization of information literacy in Europe, the implementation of information literacy programmes (under various titles) in the schools and higher education sectors, the organizations and institutions concerned with information literacy in Europe, and some examples of research programmes have been discussed by Virkus (2003). Some of the debates mentioned in the Virkus review, e.g., the relationship between information technology, information, media and digital literacy highlighted by Bawden (2001), still occur in recent literature, with more critique of the term literacy (Buschman 2009) or development of frameworks based on information and communications technology literacy (Markauskaite 2006). There is increasing emphasis on information literacy understood not simply as the knowledge and skills of the individual, but as part of a community, as for people in the fourth age (Williamson and Asla 2009), or with more emphasis on people's communities of practice (Harris 2008). Literacy is thus seen as part of people's situation, and we return to ideas of critical literacy and empowerment. Sundin (2008) contrasts some of the assumptions present in university Web-based tutorials on information literacy; the approaches highlight the different assumptions made about information literacy. Some tutorials stressed sources and services (with implicit preferences for the library's chosen collections rather than documents found on the Web), others stressed the skills of information searching (often with an emphasis on Boolean searching), or the process of searching and devising an effective strategy for the task in hand. Only the communication approach treated the student as a learner within their disciplinary community of practice.

How, therefore, can systematic reviews of information literacy address some of these concerns: what is meant by information literacy, and what information literacy programmes might achieve, or should achieve? All three reviews (Koufogiannakis and Wiebe 2006; Zhang et al. 2007; Brettle 2007) illustrate many features that are very familiar to those involved in systematic reviews for the Cochrane Collaboration. The reviews provided details of the search strategies used for different databases, the inclusion and exclusion criteria for studies that met the scope, quality criteria for further analysis, and a flow diagram giving details of how the final number of studies for detailed review was obtained. One systematic review (Koufogiannakis and Wiebe 2006) had one broad question to answer, the overall state of research on the topic of library instruction methods, and one narrow question: which teaching methods are more effective? The first hypothesis for testing, that instruction taught by a librarian face-to-face is more effective than instruction that is computer-based, was addressed through a meta-analysis of the eight studies that provided sufficient data: there was enough information to calculate the standardised mean difference despite the different measurement tools used in the controlled trials. A later systematic review on a similar topic (computer-aided instruction versus face-to-face instruction, Zhang et al. 2007) used different quality criteria and concluded that meta-analysis was not possible with their criteria. Both reviews, however, came to the same overall conclusion: that there was no difference between computer-aided instruction and traditional teaching methods.
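As context for the standardised mean difference mentioned above, a minimal sketch of how an effect size such as Cohen's d is computed from two groups' summary statistics; the numbers are invented for illustration, not data from the reviews discussed.

```python
import math

def standardised_mean_difference(m1, sd1, n1, m2, sd2, n2):
    """Cohen's d: the difference in group means divided by the pooled
    standard deviation, so that scores from different measurement tools
    become comparable on a single scale."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2)
                          / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Hypothetical example: face-to-face group scored mean 10 (SD 2, n = 20),
# computer-based group scored mean 8 (SD 2, n = 20).
d = standardised_mean_difference(10, 2, 20, 8, 2, 20)
print(d)  # 1.0
```

This rescaling is what allows trials that used different tests to contribute to one pooled estimate.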

Koufogiannakis and Wiebe (2006) also examined the effectiveness of self-directed learning versus no instruction, and traditional instruction versus no instruction. The authors stress that there are different statistical methods for pooling, and the method chosen here took account of the expected subject-to-subject variation and study-to-study variation (a random effects meta-analysis). Brettle (2007) examined information skills training in health libraries, and the focus here was the measures used to evaluate the effectiveness of training and whether those measures are valid and reliable. As in the other two systematic reviews, a large number of potentially relevant citations were screened before identifying fifty-four studies that met the criteria (and of these a much smaller number considered the validity and reliability of the measures used). Meta-analysis was not appropriate for this review.
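The random effects pooling mentioned above can be sketched with the DerSimonian-Laird estimator, one common method of this kind (the source does not state which estimator Koufogiannakis and Wiebe used, and the effect sizes below are invented for illustration):

```python
def dersimonian_laird(effects, variances):
    """Random-effects pooling: estimate between-study variance (tau^2)
    from Cochran's Q, then reweight each study by 1 / (v_i + tau^2)."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * y for wi, y in zip(w, effects)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * y for wi, y in zip(w_star, effects)) / sum(w_star)
    return pooled, tau2

# Hypothetical standardised mean differences and their variances
# from three studies:
pooled, tau2 = dersimonian_laird([0.2, 0.4, 0.6], [0.04, 0.04, 0.04])
print(round(pooled, 3), round(tau2, 3))  # 0.4 0.0
```

When the studies disagree more than their within-study variances would predict, tau^2 becomes positive and the weights flatten, which is why a random effects pooled estimate can differ from a fixed-effect one.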

Brettle (2009) asks how to decide what sort of evidence should be included in a systematic review for library and information science, and discusses the problems of the 'quality of the evidence'. For quantitative study designs, there are accepted guidelines to determine the quality of a controlled trial, and the approach adopted in systematic reviews of the Cochrane variety is to conduct appraisal first, winnowing out the lower-quality studies. There are checklists for appraisal of qualitative research but, as Brettle notes, assessing quality for studies in library and information science is difficult. Perhaps, as indicated, this is less of a problem for some meta-synthesis approaches (Dixon-Woods et al. 2006).

The meta-analysis approach described by Koufogiannakis and Wiebe (2006) had practical questions to answer, such as the impact (if any) of training, and whether one method of training is better than another. In their overview of the types of learning outcomes assessed, they note that the higher levels of learning outcomes (Bloom's taxonomy) were rarely measured. If we are moving to a view of information literacy that encompasses individuals within a community of practice, then qualitative research might help explain why the higher levels of learning outcomes are rarely measured and, if they are to be measured, who should do the measurement and how. Using a meta-synthesis approach would allow the qualitative research to be included in the review, and at an early stage too, rather than as an afterthought to the meta-analysis. First, perhaps, it is important to identify the research strategies and inquiry paradigms that would be most useful. There would be literature from the education sector on the ways of thinking and practice in different disciplines. From the information behaviour research, studies conducted using a participatory research inquiry paradigm would help to explain why and how members of a community of practice view their information practice. Some appreciation of critical theory and constructivist approaches is also necessary when examining the relevance of research studies, as members of these communities are unlikely to use the terms information literacy or information practice, and for many disciplines information literacy is more concerned with numeracy or media skills (Urquhart 2007). Such qualitative research might help to explain which outcome measures are important for the information literacy that matters to that community. Brettle (2007) comments on the way assessment of searching skills may be incorporated into an Objective Structured Clinical Examination for medical students. This may not be ideal if higher-level skills and behaviour are to be targeted, but it does convey the message that searching skills are part of clinical practice for doctors.

University library managers need to make decisions about the type of support services they offer. For them, the results of one of the systematic reviews (Koufogiannakis and Wiebe 2006) indicate that teaching is generally more effective than no instruction, but that self-directed learning and computer-aided instruction might be a solution. But managers might rightly ask which computer-aided instruction, which self-directed learning, and whether this review (predominantly containing studies conducted in the USA) really applies to academic libraries in other countries, where the model of higher education is different and different conceptions of information skills and information literacy (Virkus 2003) may exist. A realist synthesis approach to meta-synthesis might help to answer questions of the variety: what type or types of training support work best for particular disciplines, at particular stages of their educational career, or within particular types of institution? That might require working with qualitative studies of the information behaviour of students in different disciplines and in different countries, exploring and challenging assumptions about the desirability of certain skills. For example, a qualitative study (Makani and WooShue 2006) of the information behaviour of business students took into account the collaborative working expected of business students. The findings are partly quantitative, but the qualitative findings explain how and why some digital library features work for these students, while others are unappreciated.

A critical interpretive synthesis might help to illuminate what the added value of face-to-face instruction might be, why students like self-directed learning, and whether this in fact encourages them to reflect on something they just do (searching) as a means to an end (obtaining material for an assignment). Critical interpretive synthesis is suited to reviews where there is a mix of qualitative and quantitative studies, and a larger number of studies than can be handled by meta-ethnography. Such synthesis borrows some ideas from grounded theory, just as meta-ethnography does. A meta-ethnographic approach might help to illuminate why students learn the skills but then do not apply them routinely, and why information practice changes or does not change. This question might be explored by a more in-depth analysis of around five to ten qualitative studies, translating the concepts of one study into another. Meta-ethnography explores and explains differences between the studies in a systematic way. One study had data that could have been used to examine how staff expectations (explicit or implicit) affected what students viewed as important (Urquhart and Rowley 2007). Unfortunately, as the latter example demonstrates, meta-ethnography might require digging through five years of research reports on a longitudinal project – the detail provided in one journal article is insufficient.

For information literacy, and lifelong learning, we need to take a step back and reflect on the wider issues of learners' expectations of education, and on what the social and cultural issues are. The critical aspects are important, as researchers themselves may be influenced by the type of education they have received, and that at least deserves reflection when reading their aims, methods and findings. A meta-synthesis of information literacy and information behaviour should take into account different cultural practices and different notions of literacy. One study (Williamson and Asla 2009) is an example of the type of qualitative study that might contribute to meta-synthesis, to answer questions public library managers might pose. Sometimes the social and cultural issues might emerge in the process of doing the meta-synthesis, by meta-ethnographic or thematic synthesis methods, but other approaches, such as meta-narrative, might be used to illuminate different ways of understanding information practice, information behaviour and information literacy. Much depends on how wide the scope of the meta-synthesis needs to be, and the resources available to conduct it.

The companion paper (Urquhart 2011) discusses the experience of doing a meta-synthesis of research on women's information behaviour (Urquhart and Yeoman 2010). In that synthesis, the research studies were sorted into categories and a meta-ethnographic type of approach applied, although the approach was probably nearer critical interpretive synthesis, as full meta-ethnography was impractical given the time constraints.

Conclusions

Systematic reviewing of library and information science research has seemed difficult due to the lack of research that uses designs appropriate for meta-analysis. Practitioner research (in information literacy) often seems to diverge from researcher interests (in information behaviour). Meta-synthesis methods are not easy, but they do offer some possibilities for integrating information literacy and information behaviour research findings. This requires careful consideration of the research paradigms used, their assumptions about the way knowledge is generated and what constitutes valid knowledge. Fortunately, perhaps, some of these paradigms map neatly onto some of the trends in information literacy: for example, the interest in information literacy within a community maps to studies that examine information behaviour among professional groups or a 'network' of older people.

It is very early days to make recommendations about the appropriate methods of meta-synthesis that library and information science researchers should use – and much depends on who wants the results of the meta-synthesis. Realist synthesis may answer questions for policymakers and managers. Practitioners may require critical interpretive syntheses, or meta-ethnography to understand why some information literacy programmes work better than others.

Acknowledgements

The author thanks the reviewers, the CoLIS conference organizers and the participants for constructive suggestions on the paper.

About the author

Christine Urquhart has directed several studies of information seeking and use in the health sector, and also co-directed a longitudinal study of the impact of electronic information services on the information behaviour of students and staff in UK higher and further education. She also prepares systematic reviews for the Cochrane Collaboration, principally the Effective Practice and Organisation of Care Group, and is a co-author of reviews on nursing record systems and telemedicine. She was Director of Research in the Department at Aberystwyth for several years and established the training programme for doctoral students. She can be contacted at: cju@aber.ac.uk

References
How to cite this paper

Urquhart, C. (2010). "Systematic reviewing, meta-analysis and meta-synthesis for evidence-based library and information science" Information Research, 15(3) colis708. [Available at http://InformationR.net/ir/15-3/colis7/colis708.html]




© the author, 2010.
Last updated: 13 September, 2010