 Capacity building resources

Resources from RCBN and Journal: communications and impact

Chris Taylor

Chris is a senior lecturer at the School of Social Sciences at Cardiff University and co-director of the NCRM node Qualiti in Cardiff.


This guide draws heavily on resources developed by the Research Capacity Building Network (RCBN) and its journal, Building Research Capacity, between 2000 and 2005, supplemented with other material.

Impact of Research

A major report, Models of research impact: a cross-sector view of literature and practice (Building effective research: 4th report in the series), was published in 2003 by Sandra Nutley, Janie Percy-Smith and William Solesbury for the Learning and Skills Research Centre, setting out practical steps to enhance the impact of research on practice and policy. Based on a national study into the impact of research, it analysed evidence from both the research literature and actual practice across the education, social care, criminal justice and healthcare sectors, and offers recommendations to the teaching, learning and skills community for enhancing the impact of education research. The full report can be viewed in Adobe Acrobat, and a summary of the main findings and recommendations was also produced by the TLRP RCBN:

Models of research impact: a cross-sector view of literature and practice – Building effective research: 4 (Sandra Nutley, Janie Percy-Smith and William Solesbury 2003)

TLRP RCBN Summary of ‘Models of research impact: a cross-sector view of literature and practice’ (Chris Taylor 2003)

In this short article, Dr Ben Levin (University of Manitoba, Canada) offers an insight into his views on increasing the impact and value of research in education. From his unique position as someone who has spent many years traversing the divide between research and public policy, Dr Levin discusses how impact occurs and offers possible steps forward in raising the impact of education research.

Levin, B. (2003) Increasing the impact and value of research in education, Building Research Capacity, 6, 1-3

In 2005-06 the TLRP funded a seminar series to examine a variety of forms of user-engagement, their purposes and their implications. In the following short article the seminar organisers, Professor Anne Edwards (University of Oxford) and colleagues, summarise the discussions and conclusions reached. In particular they consider the spaces in which research and policy meet, the negotiations with policy communities that occur there, and the implications of these negotiations for research design in pedagogic research.

Edwards, A., Sebba, J. and Rickinson, M. (2006) Working with users in educational research: some implications for research capacity building, Building Research Capacity, 11, 1-4

The Quality and Impact of Qualitative Research
There are numerous debates about the quality of qualitative research in the social sciences and in education research. The issues of quality, validity and rigour are particularly important when considering the impact of qualitative research, and there is a widespread perception amongst researchers that qualitative research does not have quite the same impact on policy and practice as quantitative research. The TLRP RCBN commissioned two academics who have worked closely at the interface of policy-making to discuss the role of qualitative research. The first, Liz Spencer, presents a report prepared by a team at the National Centre for Social Research (NatCen) for the UK Government's Cabinet Office that develops a framework for evaluating qualitative research; in her presentation she sets out the framework for assessing qualitative evidence and the background to the report. Spencer's presentation can be viewed in MS PowerPoint:

Developing a framework for evaluating qualitative research

The full NatCen report is available from the UK Policy Hub:
A summary article of the report is also available by two of the report's authors, Jane Ritchie and Liz Spencer (National Centre for Social Research).

Ritchie, J. and Spencer, L. (2004) Qualitative data analysis: the call for transparency, Building Research Capacity, 7, 2-4

On the basis of that report the TLRP RCBN organised a seminar to consider the implications of the Cabinet Office report, ‘Developing a framework for evaluating qualitative research’. As a result of that seminar the TLRP RCBN invited participants to respond to the framework, including the report authors themselves. The resulting papers were published in a special issue of the Building Research Capacity journal.
First, there is a summary of the framework document.

Boyask, R. (2004) A summary of ‘Quality in Qualitative Evaluation: A Framework for assessing research evidence’, Building Research Capacity, 8, 1-3

The next article is generally supportive of the document, praising its clarity in a field fraught with complex methodological and philosophical debate. However, the authors suggest more work could be done to make the framework a more effective tool for the non-expert, with more detailed discussion of qualitative methods to assist evaluative judgement-making.

Murphy, L. and Dingwall, R. (2004) A response to ‘Quality in Qualitative Evaluation: A Framework for assessing research evidence’, Building Research Capacity, 8, 3-4

The next author argues that, in the absence of a strong tradition of educational evaluation in the UK, the framework may place more significant limitations on the possibilities of evaluation than the promise the document offers.

Kushner, S. (2004) Government regulation of qualitative evaluation, Building Research Capacity, 8, 5-8

The final author offers further criticism of the framework, suggesting that it is concerned with evaluating the quality of qualitative evaluation rather than being a serious attempt to evaluate qualitative research on its own terms, i.e. qualitatively.

Torrance, H. (2004) ‘Quality in Qualitative Evaluation’ – a (very) critical response, Building Research Capacity, 8, 8-10

We then have a response to these articles from the original report authors, Liz Spencer et al. In this they identify key areas of agreement and areas where their report has been misinterpreted.

Spencer, L., Ritchie, J., Lewis, J. and Dillon, L. (2005) Quality in qualitative evaluation: a response by the authors of the framework, Building Research Capacity, 10, 8-9

The second presentation commissioned by the TLRP RCBN is by Lesley Saunders, Policy Adviser for Research at the General Teaching Council (GTC) for England. In her presentation Saunders discusses the role of qualitative research in informing policy-making through the use of systematic reviews, drawing in particular on the experiences and practices of the GTC. This presentation can be viewed in MS PowerPoint:

Qualitative research: ethical evidence for policy making?

The TLRP RCBN also commissioned Martyn Hammersley (Open University) to produce a reference list on the implications of evidence-based practice and systematic review agendas for qualitative educational researchers. The resulting 2-page reference list can be viewed in MS Word. This includes references to recent debates about the quality, impact and relevance of education research in the UK and US.

Implications Of Evidence-Based Practice And Systematic Review Agendas For Qualitative Educational Researchers (Martyn Hammersley 2003)

Systematic Reviews and Meta-Analysis

It is always important when undertaking education research to question what constitutes evidence, the process by which evidence is generated, and how evidence is used to inform decisions about policy and practice. Systematic reviews and meta-analysis have, in recent times, become key tools for the generation of evidence to inform decision making in policy and practice. Techniques for systematic review and meta-analysis rest on the critical cumulation of relevant, high quality study findings to reach decisions about the state of the evidence in relation to a policy or practice relevant question. Both systematic reviews and meta-analysis emerged from the TLRP RCBN’s capacity building consultation exercise (2002-2005) as techniques for which there is high demand for capacity building and, respectively, medium and low current use. The development and use of these methods should involve consideration of such topics as: the scope of the evidence base, systematic literature searching, research quality assessment, relevance, impact, cumulation of research findings, programmes of research, techniques for the conduct of systematic reviews and meta-analyses of various sorts, and a critical evaluation of the appropriateness of these techniques for generating evidence.
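The core arithmetic behind the "critical cumulation" of study findings in a fixed-effect meta-analysis can be sketched in a few lines. The following is a minimal illustration of inverse-variance pooling; the effect sizes and standard errors are hypothetical values for demonstration only, not drawn from any study cited on this page.

```python
# Minimal sketch of a fixed-effect, inverse-variance meta-analysis.
# Effect sizes and standard errors below are illustrative, not real data.
import math

def fixed_effect_pool(effects, std_errors):
    """Pool study effect sizes using inverse-variance weights.

    Each study is weighted by 1 / SE^2, so more precise studies
    contribute more to the pooled estimate. Returns the pooled
    effect and its standard error.
    """
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# Three hypothetical studies: standardised mean differences and their SEs.
effects = [0.30, 0.10, 0.45]
std_errors = [0.12, 0.20, 0.15]

est, se = fixed_effect_pool(effects, std_errors)
low, high = est - 1.96 * se, est + 1.96 * se
print(f"pooled effect = {est:.3f}, 95% CI = ({low:.3f}, {high:.3f})")
```

A fixed-effect model assumes all studies estimate a single common effect; where study populations and interventions vary, as is usual in education research, a random-effects model that adds a between-study variance component is generally preferred.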
The TLRP RCBN collated a number of references that discuss the role and practice of systematic reviews and meta-analysis. These are organised under four headings:

  • i. Conducting systematic reviews and meta-analysis
  • ii. Critiques and alternative / new approaches
  • iii. Literature searching
  • iv. Assessing the quality of studies for inclusion in systematic reviews and meta-analysis

(i) Conducting systematic reviews and meta-analysis

  • Altman, D. & Chalmers, I. (Eds) (1995) Systematic reviews. London: BMJ Publishing Group.
  • Badger, D., Nursten, J., Williams, P. & Woodward, M. (2000) Should all literature reviews be systematic? Evaluation and Research in Education, 14, 220-230.
  • Campbell Collaboration (2001) Campbell systematic reviews: Guidelines for the preparation of review protocols, 1. Available to download from the Campbell Collaboration web-site.
  • Clarke M. & Oxman A.D. (Eds) Cochrane Reviewers' Handbook 4.1.5 [updated April 2002]. In: The Cochrane Library, Issue 2, 2002. Oxford: Update Software. Updated quarterly. Available to download from the Cochrane Collaboration web-site.
  • Cooper, H. & Hedges, L.V. (Eds) (1994) The handbook of research synthesis. New York: Russell Sage Foundation.
  • Deeks, J., Glanville, J. & Sheldon, T. (1996) Undertaking systematic reviews of research on effectiveness: CRD guidelines for those carrying out or commissioning reviews. Centre for Reviews and Dissemination, York: York Publishing Services Ltd. Can be ordered from the NHS Centre for Reviews and Dissemination web-site.
  • EPPI-Centre (2001) Review Group Manual 1.1. London: EPPI-Centre, University of London. Available to download from the EPPI-Centre web-site.
  • Fleiss, J.L. (1993) The statistical basis of meta-analysis (review). Stat Methods Med Res, 2, 121-145.
  • Glass, G.V. (1976) Primary, secondary and meta-analysis of research. Educational Researcher, 5, 3-8.
  • Glass, G.V., McGaw, B. & Smith, M.L. (1981) Meta-analysis in social research. California: Sage.
  • Hardy, R.J. & Thompson, S.G. (1996) A likelihood approach to meta-analysis with random effects. Statistics in Medicine, 15, 619-629.
  • Hedges, L.V. & Olkin, I. (1985) Statistical methods for meta-analysis. London: Academic Press.
  • L'Abbe, K.A., Detsky, A.S. & O'Rourke, K. (1987) Meta-analysis in clinical research. Annals of Internal Medicine, 107, 224-233.
  • Light, R.J. & Pillemar, D.B. (1984) Summing up: the science of reviewing research. Cambridge, Mass: Harvard University Press.
  • Matt, G.E., Cook, T.D., Cooper, H. & Hedges, L.V. (Eds) (1994) The handbook of research synthesis. New York: Russell Sage Foundation.
  • Mengersen, K.L. & Tweedie, R.L. (1995) The impact of method choice in meta-analysis. Australian and New Zealand Journal of Statistics, 37, 19-44.
  • National Research Council (1992) Combining information: statistical opportunities for research. Washington DC: National Academy Press.
  • Olkin, I. (1996) Meta-analysis: current issues in research synthesis. Statistics in Medicine, 15, 1253-1257.
  • Sacks, H.S., Berrier, J., Reitman, D., Ancona-Berk, V.A. & Chalmers, T.C. (1987) Meta-analysis of randomized controlled trials. New England Journal of Medicine, 316, 450-455.
  • Sutton, A.J., Abrams, K.R., Jones, D.R., Sheldon, T.A. & Song, F. (1998) Systematic reviews of trials and other studies. Health Technology Assessment, 2(19). 276pp. Available to download from the Department of Health National Co-ordinating Centre for Health Technology Assessment web-site.
  • Thacker, S.B. (1988) Meta-analysis. A quantitative approach to research integration. Journal of the American Medical Association, 259, 1685-1689.

(ii) Critiques and alternative / new approaches

  • Abramson, J.H. (1990) Meta-analysis: a review of pros and cons. Pub Health Rev, 9, 149-151.
  • Elphick, H.E., Tan, A., Ashby, D. & Smyth, R.L. (2002) Systematic reviews and lifelong diseases. British Medical Journal, 325, 381-384.
  • Eysenck, H.J. (1994) Systematic reviews – meta-analysis and its problems. British Medical Journal, 309, 789-792.
  • Greenland, S. (1994) Invited commentary: a critical look at some popular meta-analytic methods. American Journal of Epidemiology, 140, 290-296.
  • Jones, D.R. (1992) Meta-analysis of observational epidemiological studies: a review. Journal of the Royal Society of Medicine, 85, 165-168.
  • Lau, J., Ioannidis, J.P.A. & Schmid, C.H. (1998) Summing up evidence: one answer is not always enough. The Lancet, 351, 123-127.
  • Letzel, H. (1995) ‘Best evidence synthesis: an intelligent alternative to meta-analysis’: discussion. A case of ‘either-or’ or ‘as well’ (comment). Journal of Clinical Epidemiology, 48, 9-18.
  • Olsen, O., Middleton, P., Ezzo, J., Gotzsche, P.C., Hadhazy, V., Herxheimer, A., Kleinjnen, J. & McIntosh, H. (2001) Quality of Cochrane reviews: assessment of sample from 1998. British Medical Journal, 323, 829-832.
  • Roberts, K.A., Dixon-Woods, M., Fitzpatrick, R., Abrams, K.R. & Jones, D.R. (2002) Factors affecting uptake of childhood immunisation: a Bayesian synthesis of qualitative and quantitative evidence. The Lancet, 360, 1596-1599.
  • Scruggs, T.E., Mastropieri, M.A. & Casto, G. (1987) The quantitative synthesis of single subject research: methodology and validation. Remedial and Special Education, 8, 24-33.
  • Slavin, R.E. (1986) Best-evidence synthesis: an alternative to meta-analytic and traditional reviews. Educational Researcher, 15, 5-11.
  • Slavin, R.E. (1995) Best evidence synthesis: an intelligent alternative to meta-analysis (review). Journal of Clinical Epidemiology, 48, 9-18.
  • Stewart, L.A. & Parmar, M.K. (1993) Meta-analysis of the literature or of individual patient data: is there a difference? Lancet, 341, 418-422.
  • Thompson, S.G. (1993) Controversies in meta-analysis: the case of the trials of serum cholesterol reduction (review). Statistical Methods in Medical Research, 2, 173-192.
  • Thompson, S.G. & Pocock, S.J. (1991) Can meta-analysis be trusted? Lancet, 338, 1127-1130.

(iii) Literature searching

  • Begg, C.B. and Berlin, J.A. (1988) Publication bias: a problem in interpreting medical data (with discussion). Journal of the Royal Statistical Society A, 151, 343-353.
  • Counsell, C. & Fraser, H. (1995) Identifying relevant studies for systematic reviews. British Medical Journal, 310, 126.
  • Coursol, A. & Wagner, E.E. (1986) Effect of positive findings on submission and acceptance rates: a note on meta-analysis bias. Professional Psychology, 17, 136-137.
  • Dickersin, K., Chan, S., Chalmers, T.C., Sacks, H.S. & Smith, H.J. (1987) Publication bias and clinical trials. Controlled Clinical Trials, 8, 343-353.
  • Dickersin, K., Scherer, R. & Lefebvre, C. (1994) Systematic reviews – identifying relevant studies for systematic reviews. British Medical Journal, 309, 1286-1291.
  • Duval, S. & Tweedie, R. (2000) Trim and fill: A simple funnel-plot-based method of testing and adjusting for publication bias in meta-analysis. Biometrics, 56(2), 455-63.
  • Egger, M., Zellweger-Zahner, T., Schneider, M., Junker, C. & Lengeler, C. (1997) Language bias in randomised controlled trials published in English and German. Lancet, 350, 326-329.
  • Fleiss, J.L. & Gross, A.J. (1991) Meta-analysis in epidemiology, with special reference to studies of the association between exposure to environmental tobacco smoke and lung cancer: a critique. Journal of Clinical Epidemiology, 44, 127-139.
  • Rosenthal, R. (1979) The file drawer problem and tolerance for null results. Psychological Bulletin, 86, 638-641.
  • Smith, M.L. (1980) Publication bias and meta-analysis. Evaluation in Education, 4, 22-24.
  • Sutton, A.J., Duval, S.J., Tweedie, R.L., Abrams, K.R. & Jones, D.R. (2000) Empirical assessment of effect of publication bias on meta-analyses. British Medical Journal, 320, 1574-1575.

(iv) Assessing the quality of studies for inclusion in systematic reviews and meta-analysis

  • Cook, T.D. & Campbell, D.T. (1979) Quasi-experimentation: design and analysis issues for field settings. Boston: Houghton Mifflin.
  • Emerson, J.D., Burdick, E., Hoaglin, D.C., Mosteller, F. & Chalmers, T.C. (1990) An empirical study of the possible relation of treatment differences to quality scores in controlled randomized trials. Controlled Clinical Trials, 11, 339-352.
  • Khan, K.S., Daya, S. & Jadad, A.R. (1996) The importance of quality of primary studies in producing unbiased systematic reviews. Archives of Internal Medicine, 156, 661-666.
  • Moher, D., Jadad, A.R., Nichol, G., Penman, M., Tugwell, P. & Walsh, S. (1995) Assessing the quality of randomized controlled trials – an annotated bibliography of scales and checklists. Controlled Clinical Trials, 16, 62-73.
  • Schulz, K.F., Chalmers, I., Hayes, R.J. & Altman, D. (1995) Empirical evidence of bias: dimensions of methodological quality associated with estimates of treatment effects in controlled trials. Journal of the American Medical Association, 273, 408-412.


Meta-analysis and Qualitative Research
The following short article by Dr Ray Godfrey (Canterbury Christ Church University) considers the place of meta-analysis in qualitative research by drawing on historical debates about probability. The objective of combining (qualitative and/or quantitative) research studies, Dr Godfrey argues, faces no challenges different from those long faced in the field of statistics.

Godfrey, R. (2004) Meta-analysis and qualitative data: Insights from the history of probability, Building Research Capacity, 7, 9-11

Other Useful Links/References
A TLRP example of a systematic review and meta-analysis conducted by Mark Newman into problem-based learning – Newman, M. (2003) A pilot systematic review and meta-analysis on the effectiveness of Problem Based Learning, Special Report 2, on behalf of the Campbell Collaboration Systematic Review Group on the effectiveness of Problem Based Learning. LTSN Medicine, Dentistry and Veterinary Medicine: Newcastle upon Tyne. ISBN 0 7017 0158 7

Evidence for Policy and Practice Information and Co-ordinating Centre (EPPI-Centre) – The EPPI-Centre has been at the forefront of carrying out research synthesis and developing review methods in social science, public policy and education. The EPPI-Centre website provides further information on their work, methods for research synthesis, training courses and support in conducting systematic reviews, and access to their Evidence Library, which contains a large number of reviews in the fields of education (including Initial Teacher Education), health promotion and public health.

The Education Coordinating Group of the internationally recognised Campbell Collaboration – an international network of volunteer professionals who prepare, update and rapidly disseminate high-quality systematic reviews of educational and training interventions conducted worldwide, aimed at improving education and learning.

Centre for Reviews & Dissemination, University of York – The Centre for Reviews and Dissemination (CRD) was established in January 1994, and is now the largest group in the world engaged exclusively in evidence synthesis in the health field. It is a department of the University of York. The centre undertakes high quality systematic reviews that evaluate the effects of health and social care interventions and the delivery and organisation of health care.

The Cochrane Collaboration – The Cochrane Collaboration is an international, not-for-profit and independent organisation dedicated to making up-to-date, accurate information about the effects of healthcare readily available worldwide. It produces and disseminates systematic reviews of healthcare interventions and promotes the search for evidence in the form of clinical trials and other studies of interventions. The Collaboration also produces the Cochrane Database of Systematic Reviews, providing evidence for healthcare decision-making.

National Institute for Health Research (NIHR) Health Technology Assessment Programme – The HTA programme is part of the National Institute for Health Research (NIHR). It produces independent research information about the effectiveness, costs and broader impact of healthcare treatments and tests for those who plan, provide or receive care in the NHS.

Research project websites

A very useful way of ensuring that research has some demonstrable impact is to establish and maintain a project website. This allows anyone with internet access to find out more about the research project, the research team and the key findings, and can also include links to other research websites or resources. In this short article, Nina Smalley and Dr Jonathan Scourfield (Cardiff University) briefly outline their advice on how best to design a research project website, and provide some useful references for further guidance.

Smalley, N. and Scourfield, J. (2003) Setting up a research project website, Building Research Capacity, 6, 3-4

Building a Research Career

The TLRP RCBN put together a number of resources to help new education researchers build their research careers.
Building a Research Career – a presentation by Professor Rosemary Deem (University of Bristol, 2003) offering advice and guidance on how to build a successful research career. This presentation can be viewed in Adobe Acrobat:

General information
The Higher Education and Research Opportunities in the UK (HERO) site provides links to universities, colleges and research institutions in the UK. The site also provides information on the 2001 Research Assessment Exercise (RAE), as well as guidelines for applying for grant funding.

Finding Academic Jobs
There are several websites which advertise job vacancies, including sites carrying comprehensive listings of academic and related posts. Many institutions also advertise in The Guardian and The Times Higher Education Supplement.

Finding Academic Jobs Abroad is a meta-collection of internet resources that have been gathered for the academic jobs in the US. The site also provides links for academic jobs in Canada, Australia the UK, and many other countries. Graduate Careers Australia provides similar information for Australian institutions.

Applying for Jobs
Prospects, the UK’s official graduate careers website, offers comprehensive advice on all aspects of the job-seeking process, from compiling a CV and drafting a covering letter to preparing for an interview or taking an aptitude test, and provides guidelines on planning and preparing for job interviews.
It is also worth looking at HE institutional websites for career resources, such as the University of Strathclyde's Careers Service website.

Contract Research Staff
It is extremely likely that anyone embarking on a research career will at some point be employed as a contract researcher, usually on one-off or rolling fixed-term contracts. For some, contract research is a first post taken immediately after completing a doctorate; many others build their entire research careers by moving from one fixed-term research contract to another. Whatever the circumstances, there are many advantages and disadvantages to being a contract researcher in the field of education. To consider these, and to address the difficulties of building a research career whilst on fixed-term contracts, the TLRP RCBN organised a conference to discuss these issues in depth. The conference concluded by presenting a number of recommendations to an invited audience of key stakeholders in the employment of education contract research staff. Following the conference, the TLRP RCBN published a special issue of Building Research Capacity to share the discussions and recommendations with a wider audience. The special issue also includes feedback from one of the key stakeholders, who was asked to consider the implications of the conference recommendations for their constituency.
The first article introduces the debate on contract research staff.

Taylor, C. (2005) Special issue: Contract research staff and the TLRP, Building Research Capacity, 9, 1

The next three articles discuss the experiences of being a contract researcher, particularly within the context of working on TLRP research projects.

Wahlberg, M. et al. (2005) ‘Underground working’ – understanding hidden labour, Building Research Capacity, 9, 2-4

Burt, R. and Moore, H. (2005) Music education research: broadening horizons and enhancing research capacity for, and through, contract researchers, Building Research Capacity, 9, 4-5

Taylor, C. (2005) The highs of being a contract researcher, Building Research Capacity, 9, 5-6

There is then a presentation of the recommendations made collectively by participants at the conference.

Life beyond the TLRP: outcomes and recommendations, Building Research Capacity, 9, 6-9

Finally, Professor Stephen Baron (University of Strathclyde), then Associate Director of the TLRP with specific responsibility for research capacity building, offers a response to these recommendations.

Baron, S. (2005) Response from the ESRC TLRP, Building Research Capacity, 9, 9-10

How to reference this page: Taylor, C. (2007) Resources from the Research Capacity Building Network and Journal. London: TLRP. Online at (accessed )

Creative Commons License TLRP Resources for Research in Education by Teaching and Learning Research Programme is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 2.0 UK: England & Wales License



