THE PATH FROM STUDIES TO EVIDENCE

 

Introduction | RCBN Activities | References | Examples | Links | Software

Introduction (return to top)

This theme addresses what constitutes evidence, the process by which evidence is generated, and how evidence is used to inform decisions about policy and practice. Systematic reviews and meta-analysis have in recent years become key tools for generating evidence to inform decision making in policy and practice. Both techniques rest on the critical cumulation of relevant, high-quality study findings to reach conclusions about the state of the evidence on a policy- or practice-relevant question. The theme therefore focuses on building capacity in these skills, as examples of techniques for generating evidence-based policy and practice. In the RCBN’s consultation exercise, both systematic reviews and meta-analysis emerged as techniques in high demand for capacity building, with medium and low current use respectively. Capacity building in these skills will cover topics such as the scope of the evidence base, systematic literature searching, research quality assessment, relevance, impact, cumulation of research findings, programmes of research, techniques for conducting systematic reviews and meta-analyses of various sorts, and a critical evaluation of the appropriateness of these techniques for generating evidence. It is envisaged that these topics will be addressed through training events, provision of relevant articles, overlap with the Teaching and Learning Research Programme (TLRP) director’s office work on impact, overlap with other RCBN themes, and web-based information.
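
As a purely illustrative aside, the cumulation of research findings at the heart of meta-analysis is often carried out by weighting each study's effect size by the inverse of its variance. The short Python sketch below shows that arithmetic for a simple fixed-effect pooling; the effect sizes and standard errors are invented for illustration only and are not drawn from any study or tool referenced on this page.

    # Minimal sketch of fixed-effect (inverse-variance) pooling.
    # The numbers below are hypothetical, for illustration only.
    effects = [0.30, 0.45, 0.10]       # study effect sizes (hypothetical)
    std_errors = [0.12, 0.20, 0.15]    # corresponding standard errors (hypothetical)

    weights = [1 / se ** 2 for se in std_errors]   # inverse-variance weights
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = (1 / sum(weights)) ** 0.5          # standard error of the pooled estimate

    print(f"Pooled effect: {pooled:.3f} (standard error {pooled_se:.3f})")

Random-effects models and the other approaches discussed in the references below extend this basic calculation, for example to allow for heterogeneity between studies.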

RCBN Activities (return to top)

3 & 4 February, 2005 - Educational Evaluation, Methodology and Changing Political Contexts
This international event is the 4th and final seminar in the RCBN-supported series Qualitative research in teaching and learning: quality, innovation and future directions, co-hosted by the Education and Social Research Institute at Manchester Metropolitan University. As the culmination of the series, the seminar will explore international perspectives on possible futures for qualitative research, led by five speakers who are working at the leading edge of methodological and critical development in education and the social sciences. To allow for lively discussion and participation by all seminar members, this will be a two-day event. For more information follow this link.

28 January 2005 - Educational Evaluation, Methodology and Changing Political Contexts
One-day workshop, University of the West of England
Professor Saville Kushner and others will lead a critical review of the draft Manifesto for Educational Evaluation generated by the influential international seminar series The Cambridge Evaluation Conferences. The manifesto sets out a political and methodological response to contemporary political and policy contexts in education. Conversations around the issues it raises will be used as a basis for further deliberations over the role and nature of evaluation. For more information follow this link.

24 June 2004 - Situating qualitative research in evidence-based research and systematic review agendas
One-day seminar. University of Sheffield
Where does qualitative research fit in current political agendas for educational research? This seminar presents perspectives on the possibilities and realities of qualitative research for teaching and learning researchers in the evidence-based and systematic review agendas of public policy-makers. For further information and booking, please click here.

7 and 8 June 2004 - Design and Analysis of Randomised Controlled Trials (Friends House, London)
Two-day workshop. Please click here for further information.

25 May 2004 - Research approaches to revealing tacit knowledge (Institute of Physics, London)
One-day workshop. For further information please click here.

20 May 2004 - Randomised Trials in Educational Research (Bonhill House, London)
One-day seminar. For further information, please click here.

12 May 2004 - 'Developing research capacity in qualitative inquiry' (New Hall, Cambridge University)
This one-day seminar explores some of the tensions and possibilities of developing research capacity in contexts of qualitative inquiry. While policy-makers and funding agencies are increasingly focussing on the capacity of researchers to carry out “quality” research, this seminar examines how qualitative researchers are responding to these demands. It presents a range of views on the commensurability of qualitative research with current conceptions of capacity building, and the extent to which the interests and intentions of qualitative research can be situated within these current frameworks. More information.

18 June 2003 - 'Specifics of Undertaking a Review' London (Institute of Education)
Systematic reviews: day three of a three-day workshop series on the rationale for and conduct of systematic reviews; 1-day training workshop; 15 places; participants must have completed days one and two of the workshop series. More information.

11 June 2003 - 'Developing a Systematic Review' London (Institute of Education)
Systematic reviews: day two of a three-day workshop series on the rationale for and conduct of systematic reviews; 1-day training workshop; 15 places; participants must have completed day one of the workshop series. More information.

2 June 2003 - 'Introduction to Systematic Reviewing' London (Institute of Education)
This one-day workshop will introduce interested participants to the history and rationale of systematic reviewing, explore several different approaches to the conduct of a systematic review, and outline the main stages involved in undertaking a systematic review. More information.

31 March 2003 - 'Introduction to Systematic Reviewing' London (Institute of Education)
This one-day workshop will introduce interested participants to the history and rationale of systematic reviewing, explore several different approaches to the conduct of a systematic review, and outline the main stages involved in undertaking a systematic review. More information.

September 2003
Meta-analysis: training in the rationale for and conduct of meta-analysis; 1-day training workshop (Leicester – repeated in Exeter, October 2003); 15 places

October 2003
Meta-analysis: training in the rationale for and conduct of meta-analysis; 1-day training workshop (Exeter – repeated from Leicester, September 2003); 15 places

February 2004
Adequacy of systematic reviewing and meta-analysis in teaching and learning research; 1-day conference (Cambridge); 20 places

April 2004
Criteria for greater rigour in ‘qualitative’ research; 1-day discussion workshop (Cardiff); 15 places

References for systematic reviews and meta-analysis (return to top)

Building Research Capacity journal

Nash, R (2002) A realist scheme for social explanation: on 'numbers and narratives', Building Research Capacity, 4, pp.1-4

Roberts, K (2002) Belief and subjectivity in research: an introduction to Bayesian theory, Building Research Capacity, 3, pp.5-7

Tymms, P and Taylor Fitz-Gibbon, C (2002) Theories, hypotheses, hunches and ignorance, Building Research Capacity, 2, pp.10-11

White, P (2002) Workshop evaluation: Introduction to evidence based practice, Building Research Capacity, 1, p.3


Please also browse the following themes for references:
Conducting systematic reviews and meta-analysis
Critiques and alternative / new approaches
Literature searching
Assessing the quality of studies for inclusion in systematic reviews and meta-analysis

Conducting systematic reviews and meta-analysis (return to top)

Altman, D. & Chalmers, I. (Eds) (1995) Systematic reviews. London: BMJ Publishing Group.

Badger, D., Nursten, J., Williams, P. & Woodward, M. (2000) Should all literature reviews be systematic? Evaluation and Research in Education, 14, 220-230.

Campbell Collaboration (2001) Campbell systematic reviews: Guidelines for the preparation of review protocols, 1. Available to download from the Campbell Collaboration web-site.

Clarke, M. & Oxman, A.D. (Eds) Cochrane Reviewers' Handbook 4.1.5 [updated April 2002]. In: The Cochrane Library, Issue 2, 2002. Oxford: Update Software. Updated quarterly. Available to download from the Cochrane Collaboration web-site.

Cooper, H. & Hedges, L.V. (Eds) (1994) The handbook of research synthesis. New York: Russell Sage Foundation.

Deeks, J., Glanville, J. & Sheldon, T. (1996) Undertaking systematic reviews of research on effectiveness: CRD guidelines for those carrying out or commissioning reviews. Centre for Reviews and Dissemination, York: York Publishing Services Ltd. Can be ordered from the NHS Centre for Reviews and Dissemination web-site.

EPPI-Centre (2001) Review Group Manual 1.1. London: EPPI-Centre, University of London. Available to download from the EPPI-Centre web-site.

Fleiss, J.L. (1993) The statistical basis of meta-analysis (review). Stat Methods Med Res, 2, 121-145.

Glass, G.V. (1976) Primary, secondary and meta-analysis of research. Educ Res, 5, 3-8.

Glass, G.V., McGaw, B. & Smith, M.L. (1981) Meta-analysis in social research. California: Sage.

Hardy, R.J. & Thompson, S.G. (1996) A likelihood approach to meta-analysis with random effects. Stat Med, 15, 619-629.

Hedges, L.V. & Olkin, I. (1985) Statistical methods for meta-analysis. London: Academic Press.

L’Abbe, K.A., Detsky, A.S. & O’Rourke, K. (1987) Meta-analysis in clinical research. Ann Intern Med, 107, 224-233.

Light, R.J. & Pillemer, D.B. (1984) Summing up: the science of reviewing research. Cambridge, Mass: Harvard University Press.

Matt, G.E. & Cook, T.D. (1994) Threats to the validity of research syntheses. In: Cooper, H. & Hedges, L.V. (Eds) The handbook of research synthesis. New York: Russell Sage Foundation.

Mengersen, K.L. & Tweedie, R.L. (1995) The impact of method choice in meta-analysis. Aust J Stats, 37, 19-44.

National Research Council (1992) Combining information: statistical opportunities for research. Washington DC: National Academy Press.

Olkin, I. (1996) Meta-analysis: current issues in research synthesis. Stat Med, 15, 1253-1257.

Sacks, H.S., Berrier, J., Reitman, D., Ancona-Berk, V.A. & Chalmers, T.C. (1987) Meta-analysis of randomized controlled trials. N Engl J Med, 316, 450-455.

Sutton, A.J., Abrams, K.R., Jones, D.R., Sheldon, T.A. & Song, F. (1998) Systematic reviews of trials and other studies. Health Technology Assessment, 2(19). 276pp. Available to download from the Department of Health National Co-ordinating Centre for Health Technology Assessment web-site.

Thacker, S.B. (1988) Meta-analysis. A quantitative approach to research integration. JAMA, 259, 1685-1689.

Critiques and alternative / new approaches (return to top)

Abramson, J.H. (1990) Meta-analysis: a review of pros and cons. Pub Health Rev, 9, 149-151.

Elphick, H.E., Tan, A., Ashby, D. & Smyth, R.L. (2002) Systematic reviews and lifelong diseases. BMJ, 325, 381-384.

Eysenck, H.J. (1994) Systematic reviews – meta-analysis and its problems. BMJ, 309, 789-792.

Greenland, S. (1994) Invited commentary: a critical look at some popular meta-analytic methods. Am J Epidemiol, 140, 290-296.

Jones, D.R. (1992) Meta-analysis of observational epidemiological studies: a review. J R Soc Med, 85, 165-168.

Lau, J., Ioannidis, J.P.A. & Schmid, C.H. (1998) Summing up evidence: one answer is not always enough. Lancet, 351, 123-127.

Letzel, H. (1995) ‘Best evidence synthesis: an intelligent alternative to meta-analysis’: discussion. A case of ‘either-or’ or ‘as well’ (comment). J Clin Epidemiol, 48, 9-18.

Olsen, O., Middleton, P., Ezzo, J., Gotzsche, P.C., Hadhazy, V., Herxheimer, A., Kleijnen, J. & McIntosh, H. (2001) Quality of Cochrane reviews: assessment of sample from 1998. BMJ, 323, 829-832.

Roberts, K.A., Dixon-Woods, M., Fitzpatrick, R., Abrams, K.R. & Jones, D.R. (2002) Factors affecting uptake of childhood immunisation: a Bayesian synthesis of qualitative and quantitative evidence. The Lancet, 360, 1596-1599.

Scruggs, T.E., Mastropieri, M.A. & Casto, G. (1987) The quantitative synthesis of single subject research: methodology and validation. Remedial and Special Education, 8, 24-33.

Slavin, R.E. (1986) Best-evidence synthesis: an alternative to meta-analytic and traditional reviews. Educ Res, 15, 5-11.

Slavin, R.E. (1995) Best evidence synthesis: an intelligent alternative to meta-analysis (review). J Clin Epidemiol, 48, 9-18.

Stewart, L.A. & Parmar, M.K. (1993) Meta-analysis of the literature or of individual patient data: is there a difference? Lancet, 341, 418-422.

Thompson, S.G. (1993) Controversies in meta-analysis: the case of the trials of serum cholesterol reduction (review). Stat Methods Med Res, 2, 173-192.

Thompson, S.G. & Pocock, S.J. (1991) Can meta-analysis be trusted? Lancet, 338, 1127-1130.

Literature searching (return to top)

Begg, C.B. & Berlin, J.A. (1988) Publication bias: a problem in interpreting medical data (with discussion). J R Statist Soc A, 151, 343-353.

Counsell, C. & Fraser, H. (1995) Identifying relevant studies for systematic reviews. BMJ, 310, 126.

Coursol, A. & Wagner, E.E. (1986) Effect of positive findings on submission and acceptance rates: a note on meta-analysis bias. Professional Psychology, 17, 136-137.

Dickersin, K., Chan, S., Chalmers, T.C., Sacks, H.S. & Smith, H.J. (1987) Publication bias and clinical trials. Controlled Clinical Trials, 8, 343-353.

Dickersin, K., Scherer, R. & Lefebvre, C. (1994) Systematic reviews – identifying relevant studies for systematic reviews. BMJ, 309, 1286-1291.

Duval, S. & Tweedie, R. (2000) Trim and fill: A simple funnel-plot-based method of testing and adjusting for publication bias in meta-analysis. Biometrics, 56(2), 455-63.

Egger, M., Zellweger-Zahner, T., Schneider, M., Junker, C. & Lengeler, C. (1997) Language bias in randomised controlled trials published in English and German. Lancet, 350, 326-329.

Fleiss, J.L. & Gross, A.J. (1991) Meta-analysis in epidemiology, with special reference to studies of the association between exposure to environmental tobacco smoke and lung cancer: a critique. J Clin Epidemiol, 44, 127-139.

Rosenthal, R. (1979) The file drawer problem and tolerance for null results. Psychol Bull, 86, 638-641.

Smith, M.L. (1980) Publication bias and meta-analysis. Evaluation in Education, 4, 22-24.

Sutton, A.J., Duval, S.J., Tweedie, R.L., Abrams, K.R. & Jones, D.R. (2000) Empirical assessment of effect of publication bias on meta-analyses. BMJ, 320, 1574-1575.

Assessing the quality of studies for inclusion in systematic reviews and meta-analysis (return to top)

Cook, T.D. & Campbell, D.T. (1979) Quasi-experimentation: design and analysis issues for field settings. Boston: Houghton Mifflin.

Emerson, J.D., Burdick, E., Hoaglin, D.C., Mosteller, F. & Chalmers, T.C. (1990) An empirical study of the possible relation of treatment differences to quality scores in controlled randomized trials. Controlled Clin Trials, 11, 339-352.

Khan, K.S., Daya, S. & Jadad, A.R. (1996) The importance of quality of primary studies in producing unbiased systematic reviews. Arch Int Med, 156, 661-666.

Moher, D., Jadad, A.R., Nichol, G., Penman, M., Tugwell, P. & Walsh, S. (1995) Assessing the quality of randomized controlled trials – an annotated bibliography of scales and checklists. Controlled Clin Trials, 16, 62-73.

Schulz, K.F., Chalmers, I., Hayes, R.J. & Altman, D. (1995) Empirical evidence of bias: dimensions of methodological quality associated with estimates of treatment effects in controlled trials. JAMA, 273, 408-412.

Examples (return to top)

  • TLRP Example - Newman, M. (2003) A pilot systematic review and meta-analysis on the effectiveness of Problem Based Learning, Special Report 2, on behalf of the Campbell Collaboration Systematic Review Group on the effectiveness of Problem Based Learning. LTSN Medicine, Dentistry and Veterinary Medicine: Newcastle upon Tyne. ISBN 0 7017 0158 7
  • Smith, G.D., Song, F. & Sheldon, T.A. (1993) Cholesterol lowering and mortality: the importance of considering initial level of risk. BMJ, 306, 1367-1373.

Links (return to top)

Software (return to top)

  • Review Manager 4.1.1.
    Can be downloaded from the Cochrane Collaboration web-site, and is designed to manage the entire review process. A self-paced training exercise is also available from the web-site.

 


       
 
This page was last updated 10th November 2004
Email: RCBN@Cardiff.ac.uk