Measurement-Comparable Effect Sizes for Single-Case Studies of Free-Operant Behavior

Cited by: 59
Authors
Pustejovsky, James E. [1 ]
Affiliation
[1] Univ Texas Austin, Austin, TX 78712 USA
Keywords
effect size; single-case research; free-operant behavior; alternating renewal process; CHOICE-MAKING OPPORTUNITIES; DIFFERENCE EFFECT SIZE; QUANTITATIVE SYNTHESIS; SUBJECT RESEARCH; METAANALYSIS; STUDENTS; INTERVENTIONS; MODELS; METHODOLOGY; SCHOOL
DOI
10.1037/met0000019
Chinese Library Classification
B84 [Psychology]
Subject classification codes
04; 0402
Abstract
Single-case research comprises a set of designs and methods for evaluating the effects of interventions, practices, or programs on individual cases, through comparison of outcomes measured at different points in time. Although there has long been interest in meta-analytic techniques for synthesizing single-case research, there has been little scrutiny of whether proposed effect sizes remain on a directly comparable metric when outcomes are measured using different operational procedures. Much of single-case research focuses on behavioral outcomes in free-operant contexts, which may be measured using a variety of different direct observation procedures. This article describes a suite of effect sizes for quantifying changes in free-operant behavior, motivated by an alternating renewal process model that allows measurement comparability to be established in precise terms. These effect size metrics have the advantage of comporting with how direct observation data are actually collected and summarized. Effect size estimators are proposed that are applicable when the behavior being measured remains stable within a given treatment condition. The methods are illustrated by 2 examples, including a re-analysis of a systematic review of the effects of choice-making opportunities on problem behavior.
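In the alternating renewal process model the abstract describes, a behavior stream alternates between episodes of the behavior and interim periods; quantities such as prevalence (the proportion of session time occupied by the behavior) follow from the mean episode duration and mean interim length. The sketch below simulates such a process and checks the prevalence quantity empirically. It is a minimal illustration only, not the paper's estimators: the exponential duration distributions, the parameter values, and the function names (`simulate_arp`, `prevalence`) are assumptions made for the example.

```python
import random

def simulate_arp(mu, lam, total_time, seed=0):
    """Simulate one session of an alternating renewal process.

    Behavior episodes (mean duration mu) alternate with interim
    periods (mean length lam). Both are drawn as exponential here
    purely for simplicity -- an assumption of this sketch, not a
    requirement of the model. Returns (start, end) episode intervals,
    truncated at total_time.
    """
    rng = random.Random(seed)
    t, episodes = 0.0, []
    while t < total_time:
        t += rng.expovariate(1.0 / lam)        # interim period
        start = min(t, total_time)
        t += rng.expovariate(1.0 / mu)         # behavior episode
        episodes.append((start, min(t, total_time)))
    return episodes

def prevalence(episodes, total_time):
    """Observed proportion of session time occupied by the behavior."""
    return sum(end - start for start, end in episodes) / total_time

# Long-run prevalence under the model is mu / (mu + lam);
# here 5 / (5 + 20) = 0.20, which a long session should approach.
eps = simulate_arp(mu=5.0, lam=20.0, total_time=100_000.0)
print(prevalence(eps, 100_000.0))  # close to 0.20
```

A simulation like this makes the measurement-comparability point concrete: different recording procedures (e.g., continuous recording vs. interval sampling) applied to the same simulated stream will summarize the same underlying prevalence differently, which is the mismatch the paper's effect sizes are designed to resolve.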
Pages: 342-359
Page count: 18