
Experimental Designs

Stephen Gorard

Stephen is ...


Contents  

1. Introduction
            1.1 We all need trials
            1.2 The basic idea
2. The value of trials
            2.1 Comparators
            2.2 Matching groups
            2.3 Causal models
3. Simple trial designs
            3.1 Sampling
            3.2 Design
4. Related issues
            4.1 Ethical considerations
            4.2 Warranting conclusions
            4.3 The full cycle of research
5. Resources for those conducting trials
            5.1 Reporting trials
            5.2 Tips for those in the field
6. Alternatives to trials
            6.1 Regression discontinuity
            6.2 Design experiments
7. Some examples of trials
            7.1 STAR
            7.2 High/Scope
            7.3 ICT
8. References

8. References

Adair, J. (1973) The Human Subject, Boston: Little, Brown and Co.

Arjas, E. (2001) Causal analysis and statistics: a social sciences perspective, European Sociological Review, 17, 1, 59-64

Ayer, A. (1972) Russell, London: Fontana

Barnett, W. (1996) Lives in the balance: Age-27 benefit-cost analysis of the High/Scope Perry Preschool Program (Monographs of the High/Scope Educational Research Foundation, 11), Ypsilanti, MI: High/Scope Press

Booth, W., Colomb, G. and Williams, J. (1995) The craft of research, Chicago: University of Chicago Press

Boruch, R., and Mosteller, F. (2002) Overview and new directions, in Mosteller, F. and Boruch, R. (Eds) Evidence Matters: Randomized Trials in Education Research, Washington: Brookings Institution Press

Boyd-Zaharius, J. (1999) Project STAR: The story of the Tennessee class size study, American Educator, 1-6

Brighton, M. (2000) Making our measurements count, Evaluation and Research in Education, 14, 3-4, 124-135

Brignell, J. (2000) Sorry, wrong number! The abuse of measurement, European Science and Environment Forum

Brooks, G., Miles, J., Torgerson, C. and Torgerson, D. (2006) Is an intervention using computer software effective in literacy learning? A randomised controlled trial, Educational Studies, 32, 1

Brown, C. and Lilford, R. (2006) The stepped wedge design: a systematic review, BMC Medical Research Methodology, 6, 54

Brown, A. (1992) Design experiments: theoretical and methodological challenges in creating complex interventions in classroom settings, The Journal of the Learning Sciences, 2, 2, 141-178

Bryman, A. (2001) Social research methods, Oxford: Oxford University Press

Butler, R. (1988) Enhancing and undermining intrinsic motivation: the effects of task-involving evaluation on interest and performance, British Journal of Educational Psychology, 58, 1-14

Campbell, D. and Stanley, J. (1963) Experimental and quasi-experimental designs for research, Boston: Houghton Mifflin

Campbell, M., Fitzpatrick, R., Haines, A., Kinmouth, A., Sandercock, P., Spiegelhalter, D. and Tyrer, P. (2000) Framework for design and evaluation of complex interventions to improve health, British Medical Journal, 321, 694-6

Carlin, J., Taylor, P. and Nolan, T. (1998) School based bicycle safety education and bicycle injuries in children: a case control study, Injury Prevention, 4, 22-7

Carter, H. (2000) NHS helpline offers bad advice, survey claims, The Guardian, 8/8/00, p.2

Cobb, P., Confrey, J., diSessa, A., Lehrer, R. and Schauble, L. (2003) Design experiments in educational research, Educational Researcher, 32, 1, 9-13

Cohen, L., Manion, L. and Morrison, K. (2007) Research methods in education, London: Routledge

Collins, A. (1992) Toward a design science of education, in Scanlon, E. and O'Shea, T. (Eds) New directions in educational technology, New York: Springer-Verlag

Cook, T. (2002) Randomized experiments in educational policy research: a critical examination of the reasons the educational evaluation community has offered for not doing them, Educational Evaluation and Policy Analysis, 24, 3, 175-199

Cook, T. and Campbell, D. (1979) Quasi-experimentation: design and analysis issues for field settings, Chicago: Rand McNally

Cook, T. and Gorard, S. (2007) What counts and what should count as evidence, pp.33-49 in OECD (Eds.) Evidence in education: Linking research and policy, Paris: OECD

Cox, D. and Wermuth, N. (2001) Some statistical aspects of causality, European Sociological Review, 17, 1, 65-74

de Corte, E., Verschaffel, L. and van De Ven, A. (2001) Improving text comprehension strategies in upper primary school children: a design experiment, British Journal of Educational Psychology, 71, 531-559

de Leon, G., Inciardi, J. and Martin, S. (1995) Residential drug abuse treatment research: are conventional control designs appropriate for assessing treatment effectiveness?, Journal of Psychoactive Drugs, 27, 85-91

de Vaus, D. (2001) Research design in social science, London: Sage

Dehejia, R. and Wahba, S. (1999) Causal effects in nonexperimental studies: reevaluating the evaluation of training programs, Journal of the American Statistical Association, 94, 448, 1053-1062

Ercikan, K. and Wolff-Michael, R. (2006) What good is polarizing research into qualitative and quantitative?, Educational Researcher, 35, 5, 14-23

Fisher, R. (1935) The design of experiments, Edinburgh: Oliver and Boyd

Fitz-Gibbon, C. (2000) Education: realising the potential, in Davies, H., Nutley, S. and Smith, P. (Eds.) What works? Evidence-based policy and practice in public services, Bristol: Policy Press

Fitz-Gibbon, C. (2001) What's all this about 'evidence'?, Learning and Skills Research, 5, 1, 27-29

Fitz-Gibbon, C. (2003) Milestones en route to evidence-based policies, Research Papers in Education, 18, 4, 313-329

Gambetta, D. (1987) Were they pushed or did they jump? Individual decision mechanisms in education, London: Cambridge University Press

Garrison, R. (1993) Mises and his methods, pp.102-117 in Herbener, J. (Ed.) The meaning of Ludwig von Mises: contributions in economics, sociology, epistemology, and political philosophy, Boston: Kluwer Academic Publishers

Ghouri, N. (1999) Football approach risks an own goal, Times Educational Supplement, 4/6/99, p. 9

Gleick, J. (1988) Chaos, London: Heinemann

Glymour, C., Scheines, R., Spirtes, P. and Kelly, K. (1987) Discovering causal structure, Orlando: Academic Press

Goldthorpe, J. (2001) Causation, statistics, and sociology, European Sociological Review, 17, 1, 1-20

Goodson, V. (2002) Does the method of assessing affect performance? TERSE Report http://ceem.dur.ac.uk/ebeuk/research/terse/Goodson.htm (15th December 2002)

Gorard, S. (2002a) The role of causal models in education as a social science, Evaluation and Research in Education, 16, 1, 51-65

Gorard, S. (2002b) Ethics and equity: pursuing the perspective of non-participants, Social Research Update, 39, 1-4

Gorard, S. (2002c) Fostering scepticism: the importance of warranting claims, Evaluation and Research in Education, 16, 3, 136-149

Gorard, S. (2003) Quantitative methods in social science: the role of numbers made easy, London: Continuum

Gorard, S. (2005) Current contexts for research in educational leadership and management, Educational Management Administration and Leadership, 33, 2, 155-164

Gorard, S. (2006) Towards a judgement-based statistical analysis, British Journal of Sociology of Education, 27, 1, 67-80

Gorard, S. (2007) The dubious benefits of multi-level modelling, International Journal of Research and Method in Education, 30, 2, 221-236

Gorard, S. and Roberts, K. (2004) What kind of creature is a design experiment?, British Educational Research Journal, 30, 3

Gorard, S., Rushforth, K. and Taylor, C. (2004) Is there a shortage of quantitative work in education research?, Oxford Review of Education, 30, 3, 371-395

Gorard, S., with Taylor, C. (2004) Combining methods in educational and social research, London: Open University Press

Hammond, P. and Yeshanew, T. (2007) The impact of feedback on school performance, Educational Studies, 33, 2, 99-113

Hedges, L. (2000) Using converging evidence in policy formation: the case of class size research, Evaluation and Research in Education, 14, 3-4, 193-205

Heise, D. (1975) Causal analysis, New York: John Wiley

Hendry, D. and Mizon, G. (1999) The pervasiveness of Granger causality in econometrics, Nuffield College Oxford, (mimeo)

Huck, S. and Sandler, H. (1979) Rival hypotheses: alternative interpretations of data based conclusions, New York: Harper and Row

Hume, D. (1962) On Human Nature and the Understanding, New York: Collier

Johnson, B. (2001) Towards a new classification of nonexperimental quantitative research, Educational Researcher, 30, 2, 3-14

Kelly, A. (2003) Research as design, Educational Researcher, 31, 1, 3-4

Kelly, A. and Lesh, R. (2002) Understanding and Explicating the design experiment methodology, Building Research Capacity, 3, 1-3

Luyten, H. (2006) An empirical assessment of the absolute effect of schooling: regression-discontinuity applied to TIMSS-95, Oxford Review of Education, 32, 3, 397-429

Malacova, E. (2007) Effects of single-sex education on progress in GCSE, Oxford Review of Education, 33, 2, 233-259

McKim, V. and Turner, S. (1997) Causality in crisis? Statistical methods and the search for causal knowledge in the social sciences, Indiana: University of Notre Dame Press

Moore, L. (2002) Lessons from using randomised trials in health promotion, Building Research Capacity, 1, 1, 4-5

Morgan, C. (1903) Introduction to comparative psychology, London: Walter Scott

Mosteller, F. (1995) The Tennessee study of class size in the early school grades, The Future of Children: Critical Issues for Children and Youths, 5, 2, 113-127

National Research Council (2002) Scientific research in education, Washington DC: National Academy Press

Nutbeam, D., Macaskill, P., Smith, C. et al. (1993) Evaluation of two school smoking programmes under normal classroom conditions, British Medical Journal, 306, 102-107

Petrosino, A., Turpin-Petrosino, C. and Finckenauer, J. (2000) Programs can have harmful effects!: Lessons from experiments of programs such as scared straight. Crime and Delinquency, 46, 1, 354-379

Pötter, U. and Blossfeld, H. (2001) Causal inference from series of events, European Sociological Review, 17, 1, 21-32

Robinson, D., Levin, J., Thomas, G., Pituch, K. and Vaughn, S. (2007) The incidence of ‘causal’ statements in teaching-and-learning research journals, American Educational Research Journal, 44, 2, 400-413

Rosenbaum, P. and Rubin, D. (1983) The central role of the propensity score in observational studies for causal effects, Biometrika, 70, 1, 41-55

Rouse, C., Krueger, A. and Markham, L. (2004) Putting computerized instruction to the test: a randomized evaluation of a 'scientifically-based' reading program, National Bureau of Economic Research

Salmon, W. (1998) Causality and explanation, New York: Oxford University Press

Schweinhart, L., Montie, J., Xiang, Z., Barnett, W., Belfield, C. and Nores, M. (2005) Lifetime effects: The High/Scope Perry Preschool study through age 40 (Monographs of the High/Scope Educational Research Foundation, 14), Ypsilanti, MI: High/Scope Press

Schweinhart, L., Barnes, H. and Weikart, D. (1993) Significant benefits: The High/Scope Perry Preschool study through age 27 (Monographs of the High/Scope Educational Research Foundation, 10), Ypsilanti, MI: High/Scope Press

Sloane, F. and Gorard, S. (2003) Exploring modeling aspects of design experiments, Educational Researcher, 31, 1, 29-31

Somekh, B. and Lewis, C. (2005) Research methods in the social sciences, London: Sage

Torgerson, C. and Torgerson, D. (2001) The need for randomised controlled trials in educational research, British Journal of Educational Studies, 49, 3, 316-328

Torgerson, C., Torgerson, D., Birks, Y. and Porthouse, J. (2005) A comparison of the quality of randomised controlled trials in education and health, British Educational Research Journal, 35, 1

Torgerson, D. and Torgerson, C. (2008) Designing randomised trials in health, education and the social sciences, Basingstoke: Palgrave Macmillan

Zaritsky, R., Kelly, A., Flowers, W., Rogers, E. and O’Neill, P. (2003) Clinical design sciences: a view from sister design efforts, Educational Researcher, 31, 1, 32-34

 

How to reference this page: Gorard, S. (2007) Experimental Designs. London: TLRP. Online at http://www.tlrp.org/capacity/rm/wt/gorard (accessed )

Creative Commons License TLRP Resources for Research in Education by Teaching and Learning Research Programme is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 2.0 UK: England & Wales License

 


 
   

 

 