Do medical students' scores using different assessment instruments predict their scores in clinical reasoning using a computer-based simulation?

Cited by: 9
Authors:
Fida, Mariam [1]
Kassab, Salah Eldin [2]
Affiliations:
[1] Arabian Gulf Univ, Coll Med & Med Sci, Dept Mol Med, Manama, Bahrain
[2] Suez Canal Univ, Fac Med, Dept Med Educ, Ring Rd, POB 41111, Ismailia, Egypt
Keywords:
medical education; computer-based simulations; virtual patients; student assessment; PBL; Bahrain
DOI:
10.2147/AMEP.S77459
Chinese Library Classification: G40 [Education]
Discipline codes: 040101; 120403
Abstract:
Purpose: The development of clinical problem-solving skills evolves over time and requires structured training and background knowledge. Computer-based case simulations (CCS) have been used for both teaching and assessment of clinical reasoning skills. However, previous studies examining the psychometric properties of CCS as an assessment tool have reported conflicting findings. Furthermore, studies reporting the integration of CCS into problem-based medical curricula have been limited.
Methods: This study examined the psychometric properties of a CCS software package (DxR Clinician) used for the assessment of medical students (n=130) in a problem-based, integrated multisystem module (Unit IX) during the academic year 2011-2012. Internal consistency reliability of CCS scores was calculated using Cronbach's alpha. The relationships between students' scores on the CCS components (clinical reasoning, diagnostic performance, and patient management) and their scores on the other end-of-unit examination tools, including multiple-choice questions, short-answer questions, the objective structured clinical examination (OSCE), and real patient encounters, were analyzed using stepwise hierarchical linear regression.
Results: Internal consistency reliability of CCS scores was high (alpha=0.862). Inter-item correlations between students' scores on the different CCS components, and between their CCS scores and scores on the other test items, were statistically significant. Regression analysis indicated that OSCE scores predicted 32.7% and 35.1% of the variance in clinical reasoning and patient management scores, respectively (P < 0.01). Multiple-choice question scores, however, predicted only 15.4% of the variance in diagnostic performance scores (P < 0.01), while students' scores in real patient encounters did not predict any of the CCS scores.
Conclusion: Students' scores on the OSCE are the most important predictors of their scores in clinical reasoning and patient management using CCS. Real patient encounter assessment, however, does not appear to test a construct similar to that tested by CCS.
Pages: 135-141 (7 pages)