Gathering Validity Evidence for Surgical Simulation: A Systematic Review

Cited by: 113
Authors
Borgersen, Nanna Jo [1,2,3]
Naur, Therese M. H. [1,3]
Sorensen, Stine M. D. [1,3]
Bjerrum, Flemming [1,4]
Konge, Lars [1,3]
Subhi, Yousif [2,3]
Thomsen, Ann Sofia S. [1,5]
Affiliations
[1] Copenhagen Acad Med Educ & Simulat, Copenhagen, Denmark
[2] Zealand Univ Hosp, Dept Ophthalmol, Roskilde, Denmark
[3] Univ Copenhagen, Fac Hlth & Med Sci, Copenhagen, Denmark
[4] Herlev Gentofte Hosp, Dept Surg, Herlev, Denmark
[5] Rigshosp Glostrup, Dept Ophthalmol, Copenhagen, Denmark
Keywords
Messick framework; surgical simulation; validity framework; validity; OBJECTIVE STRUCTURED ASSESSMENT; TECHNOLOGY-ENHANCED SIMULATION; STANDARDS; GUIDE; PERFORMANCE; VALIDATION; EDUCATION
DOI
10.1097/SLA.0000000000002652
CLC number
R61 [Operative Surgery];
Subject classification
Abstract
Objective: To identify current trends in the use of validity frameworks in surgical simulation, to provide an overview of the evidence behind the assessment of technical skills in all surgical specialties, and to present recommendations and guidelines for future validity studies.

Summary of Background Data: Validity evidence for assessment tools used in the evaluation of surgical performance is of paramount importance to ensure valid and reliable assessment of skills.

Methods: We systematically reviewed the literature by searching 5 databases (PubMed, EMBASE, Web of Science, PsycINFO, and the Cochrane Library) for studies published from January 1, 2008, to July 10, 2017. We included original studies evaluating simulation-based assessments of health professionals in surgical specialties and extracted data on surgical specialty, simulator modality, participant characteristics, and the validity framework used. Data were synthesized qualitatively.

Results: We identified 498 studies with a total of 18,312 participants. Publications involving validity assessments in surgical simulation more than doubled, from ~30 studies/year in 2008 to 2010 to ~70 to 90 studies/year in 2014 to 2016. Only 6.6% of the studies used the recommended contemporary validity framework (Messick); the majority used outdated frameworks such as face validity. Significant differences were identified across surgical specialties. The evaluated assessment tools were mostly inanimate or virtual reality simulation models.

Conclusion: An increasing number of studies have gathered validity evidence for simulation-based assessments in surgical specialties, but the use of outdated frameworks remains common. To address current practice, this paper presents guidelines on how to use the contemporary validity framework when designing validity studies.
Pages: 1063-1068
Page count: 6