Setting Performance Standards for Mannequin-Based Acute-Care Scenarios: An Examinee-Centered Approach

Cited by: 38
Authors
Boulet, John R. [1 ]
Murray, David [2 ]
Kras, Joseph [2 ]
Woodhouse, Julie [2 ]
Affiliations
[1] Fdn Advancement Int Med Educ & Res, Philadelphia, PA 19104 USA
[2] Washington Univ, Sch Med, Dept Anesthesiol, St Louis, MO USA
Source
SIMULATION IN HEALTHCARE-JOURNAL OF THE SOCIETY FOR SIMULATION IN HEALTHCARE | 2008, Vol. 3, No. 2
Keywords
Simulation; Assessment; Standard setting; Psychometrics; Performance assessment;
DOI
10.1097/SIH.0b013e31816e39e2
Chinese Library Classification
R19 [Health organizations and services (health administration)];
Abstract
Background: In medicine, standard setting methodologies have been developed for both selected-response and performance-based assessments. For simulation-based tasks, research efforts have been directed primarily at assessments that incorporate standardized patients. Mannequin-based evaluations often demand complex, time-sensitive, hierarchically ordered, sequential actions that are difficult to evaluate and score. Moreover, collecting reliable proficiency judgments, necessary to estimate meaningful cut points, can be challenging. The purpose of this investigation was to explore whether expert judgments obtained using an examinee-centered standard setting method that was previously validated for standardized patient-based assessments could be used to set defensible standards for acute-care, mannequin-based scenarios.
Methods: Nineteen physicians were recruited to serve as panelists. For each of 12 simulation scenarios, between 8 and 10 performance samples (audio-video recordings), covering the expected ability continuum, were chosen for review. The performance samples were selected from a previously administered evaluation of postgraduate trainees. Based on a consensus definition of readiness to enter unsupervised practice, the panelists made independent judgments of each performance. For each scenario, the association between the panelists' judgments and the assessment scores was summarized and used to estimate a scenario-specific cut score.
Results: For 9 of the scenarios, there was at least a moderately strong relationship between the aggregate panelists' ratings and the performance scores, thus allowing for estimation of meaningful numeric standards. For the other 3 scenarios, the aggregate decision rules used by the panelists did not correspond with the achievement measures. For scenarios independently rated by split panels, the estimated cut scores were similar.
Conclusions: An examinee-centered approach, using aggregate expert judgments of audio-video performances, was suitable for setting standards on most acute-care, mannequin-based scenarios. It is necessary, however, to have valid scores for the chosen scenarios and to sample performances across the ability spectrum. (Sim Healthcare 3: 72-81, 2008)
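The cut-score estimation described in the Methods can be illustrated with a minimal sketch. The abstract does not specify the exact statistical model, so this example assumes one common examinee-centered technique: fitting a logistic curve to the fraction of panelists judging each performance "ready for unsupervised practice" as a function of its checklist score, and taking the score where that fitted probability crosses 0.5 as the cut score. The data and function names below are hypothetical, not from the study.

```python
import math

# Hypothetical performance samples for one scenario:
# (checklist score, fraction of panelists judging "ready")
samples = [
    (35, 0.0), (42, 0.1), (50, 0.2), (55, 0.4), (60, 0.5),
    (64, 0.6), (70, 0.8), (78, 0.9), (85, 1.0), (90, 1.0),
]

def fit_logistic(data, lr=0.01, epochs=20000):
    """Fit p(ready | score) = sigmoid(a + b * z) by gradient descent,
    where z is the standardized score (for numerical stability)."""
    mean = sum(s for s, _ in data) / len(data)
    sd = (sum((s - mean) ** 2 for s, _ in data) / len(data)) ** 0.5
    a, b = 0.0, 0.0
    for _ in range(epochs):
        grad_a = grad_b = 0.0
        for s, y in data:
            z = (s - mean) / sd
            p = 1.0 / (1.0 + math.exp(-(a + b * z)))
            grad_a += p - y          # d(cross-entropy)/da
            grad_b += (p - y) * z    # d(cross-entropy)/db
        a -= lr * grad_a
        b -= lr * grad_b
    return a, b, mean, sd

def cut_score(data):
    """Score at which the fitted probability of a 'ready' judgment is 0.5,
    i.e. solve a + b * (s - mean) / sd = 0 for s."""
    a, b, mean, sd = fit_logistic(data)
    return mean - a * sd / b

print(f"Estimated cut score: {cut_score(samples):.1f}")
```

A weak association between judgments and scores (as reported for 3 of the 12 scenarios) would show up here as a near-flat fitted curve, making the crossing point unstable and the resulting cut score unusable.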
Pages: 72-81
Page count: 10