Evaluating technology enhanced learning by using single-case experimental design: A systematic review

Cited: 1
Authors
Dayo, Nadira [1 ,2 ,3 ,4 ]
Metwaly, Sameh Said [1 ,2 ]
Van Den Noortgate, Wim [1 ,2 ]
Affiliations
[1] Katholieke Univ Leuven, Fac Psychol & Educ Sci, Leuven, Belgium
[2] Katholieke Univ Leuven, ITEC Imec Res Grp, Leuven, Belgium
[3] Katholieke Univ Leuven, Fac Psychol & Educ Sci, KU Leuven campus Kulak Kortrijk,E Sabbelaan 51, B-8500 Kortrijk, Belgium
[4] Katholieke Univ Leuven, ITEC Imec Res Grp, KU Leuven campus Kulak Kortrijk,E Sabbelaan 51, B-8500 Kortrijk, Belgium
Keywords
systematic review; single-case experimental design; technology-enhanced learning; AUTISM SPECTRUM DISORDERS; SUBJECT RESEARCH; CHILDREN; STUDENTS; INTERVENTIONS; TRENDS; TUTOR;
DOI
10.1111/bjet.13468
CLC Classification
G40 [Education]
Discipline Codes
040101; 120403
Abstract
Single-case experimental designs (SCEDs) may offer a reliable and internally valid way to evaluate technology-enhanced learning (TEL). A systematic review was conducted to provide an overview of what, why and how SCEDs are used to evaluate TEL. Accordingly, 136 studies from nine databases fulfilling the inclusion criteria were included. The results showed that most of the studies were conducted in the field of special education, focusing on evaluating the effectiveness of computer-assisted instruction, video prompts and mobile devices to improve language and communication, socio-emotional skills and mental health. The research objective of most studies was to evaluate the effects of the intervention; often no specific justification for using a SCED was provided. Additionally, multiple baseline and phase designs were the most common SCED types, with most measurements in the intervention phase. Frequent data collection methods were observation, tests, questionnaires and task analysis, whereas visual and descriptive analysis were common methods for data analysis. Nearly half of the studies did not acknowledge any limitations, while a few mentioned generalization and small sample size as limitations. The review provides valuable insights into utilizing SCEDs to advance TEL evaluation methodology and concludes with a reflection on further opportunities that SCEDs can offer for evaluating TEL.

Practitioner notes

What is already known about this topic
  • SCEDs use multiple measurements to study a single participant over multiple conditions, in the absence and presence of an intervention.
  • SCEDs can be rigorous designs for evaluating behaviour change caused by any intervention, including technology-based interventions.

What this paper adds
  • Reveals patterns, trends and gaps in the use of SCEDs for TEL.
  • Identifies the study disciplines, EdTech tools and outcome variables studied using SCEDs.
  • Provides a comprehensive understanding of how SCEDs are used to evaluate TEL by shedding light on methodological techniques.
  • Enriches insights about justifications for and limitations of using SCEDs for TEL.

Implications for practice and/or policy
  • Informs about the use of the rigorous SCED method for the evaluation of technology-driven interventions across various disciplines.
  • Contributes thereby to the quality of an evidence base, which provides policymakers and other stakeholders a consolidated resource to design, implement and decide about TEL.
Pages: 2457-2477
Page count: 21
Related Papers
50 items in total
  • [1] Evaluating single-case research data for systematic review: A commentary for the special issue
    Maggin, Daniel M.
    Odom, Samuel L.
    JOURNAL OF SCHOOL PSYCHOLOGY, 2014, 52 (02) : 237 - 241
  • [2] Direct Training to Improve Educators' Treatment Integrity: A Systematic Review of Single-Case Design Studies
    Fallon, Lindsay M.
    Kurtz, Kathryn D.
    Mueller, Marlana R.
    SCHOOL PSYCHOLOGY QUARTERLY, 2018, 33 (02) : 169 - 181
  • [3] Single-Case Experimental Designs: A Systematic Review of Published Research and Current Standards
    Smith, Justin D.
    PSYCHOLOGICAL METHODS, 2012, 17 (04) : 510 - 550
  • [4] Methodological quality of meta-analyses of single-case experimental studies
    Jamshidi, Laleh
    Heyvaert, Mieke
    Declercq, Lies
    Fernandez-Castilla, Belen
    Ferron, John M.
    Moeyaert, Mariola
    Beretvas, S. Natasha
    Onghena, Patrick
    Van den Noortgate, Wim
    RESEARCH IN DEVELOPMENTAL DISABILITIES, 2018, 79 : 97 - 115
  • [5] The design of technology-enhanced vocabulary learning: A systematic review
    Zhou, Keyi
    Jin, Fangzhou
    Li, Weiwei
    Song, Zicong
    Huang, Xianhan
    Lin, Chin-Hsi
    EDUCATION AND INFORMATION TECHNOLOGIES, 2024, 29 (12) : 14875 - 14897
  • [6] Single-Case Experimental Design for Evaluating Basic Reading Interventions for Students With Learning Disabilities: An Illustrative Case
    Berkeley, Sheri
    Hock, Michael
    Washburn, Jocelyn
    LEARNING DISABILITY QUARTERLY, 2023, 46 (01) : 6 - 16
  • [7] Single-Case Experimental Designs in Educational Research: A Methodology for Causal Analyses in Teaching and Learning
    Plavnick, Joshua B.
    Ferreri, Summer J.
    EDUCATIONAL PSYCHOLOGY REVIEW, 2013, 25 (04) : 549 - 569
  • [8] A Systematic Quality Review of Single-Case Experimental Designs Using Acceptance and Commitment Therapy in Adult Clinical Populations
    Luck, Safia A. M.
    Golijani-Moghaddam, Nima
    Dawson, David L.
    BEHAVIOR THERAPY, 2024, 55 (04) : 856 - 871
  • [9] Single-Case Experimental Design in Rehabilitation: Basic Concepts, Advantages, and Challenges
    Yang, Lujia
    Armijo-Olivo, Susan
    Gross, Douglas P.
    AMERICAN JOURNAL OF PHYSICAL MEDICINE & REHABILITATION, 2023, 102 (05) : 454 - 460
  • [10] Single-case design meta-analyses in education and psychology: a systematic review of methodology
    Moeyaert, Mariola
    Dehghan-Chaleshtori, Marzieh
    Xu, Xinyun
    Yang, Panpan
    FRONTIERS IN RESEARCH METRICS AND ANALYTICS, 2023, 8