How Methodological Features Affect Effect Sizes in Education

Cited by: 303
Authors
Cheung, Alan C. K. [1,2]
Slavin, Robert E. [3]
Affiliations
[1] Chinese Univ Hong Kong, Fac Educ, Dept Educ Adm & Policy, Room 422, Ho Tim Bldg, Shatin, Hong Kong, Peoples R China
[2] Chinese Univ Hong Kong, Fac Educ, CUSP, Room 422, Ho Tim Bldg, Shatin, Hong Kong, Peoples R China
[3] Johns Hopkins Univ, Sch Educ, 300 East Joppa Rd, Baltimore, MD 21286 USA
Keywords
disparities; effect size; program evaluation; research methodology; research utilization; EFFECTIVE READING PROGRAMS; TECHNOLOGY APPLICATIONS; SYSTEMATIC REVIEWS; STRUGGLING READERS; MATHEMATICS; OUTCOMES; INTERVENTIONS; ENGLISH; ISSUES
DOI
10.3102/0013189X16656615
CLC classification number
G40 [Education]
Subject classification codes
040101; 120403
Abstract
As evidence becomes increasingly important in educational policy, it is essential to understand how research design might contribute to reported effect sizes in experiments evaluating educational programs. A total of 645 studies from 12 recent reviews of evaluations of preschool, reading, mathematics, and science programs were examined. Effect sizes were roughly twice as large for published articles, small-scale trials, and experimenter-made measures as for unpublished documents, large-scale studies, and independent measures, respectively. Effect sizes were also significantly higher in quasi-experiments than in randomized experiments. Excluding tutoring studies, there were no significant differences in effect sizes between elementary and middle/high school studies. Regression analyses found that the effects of each factor were maintained after controlling for all other factors. Explanations for the effects of methodological features on effect sizes are discussed, as are implications for evidence-based policy.
Pages: 283-292
Number of pages: 10
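The abstract's contrasts (e.g., effect sizes "roughly twice as large" under some study features) concern standardized mean differences. As a minimal sketch of how such an effect size is computed (not the authors' analysis; the two sets of post-test scores below are fabricated purely for illustration), Cohen's d with a pooled standard deviation looks like this:

```python
from statistics import mean, stdev

def cohens_d(treatment, control):
    """Standardized mean difference (Cohen's d) using the pooled sample SD."""
    n1, n2 = len(treatment), len(control)
    s1, s2 = stdev(treatment), stdev(control)  # sample SDs (n - 1 denominator)
    pooled_sd = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
    return (mean(treatment) - mean(control)) / pooled_sd

# Illustrative (fabricated) post-test scores for two groups
treatment = [82, 85, 88, 90, 79, 84]
control = [78, 80, 83, 85, 76, 81]
print(round(cohens_d(treatment, control), 2))  # → 1.14
```

Because d divides the raw mean difference by the pooled spread of scores, narrower, less variable samples and more closely aligned outcome measures (as in small-scale trials with experimenter-made tests) can yield larger d values for the same underlying intervention effect.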