Pattern recognition as a concept for multiple-choice questions in a national licensing exam

Cited by: 20
Authors
Freiwald, Tilo [1 ]
Salimi, Madjid [2 ]
Khaljani, Ehsan [3 ]
Harendza, Sigrid [4 ]
Affiliations
[1] Goethe Univ Hosp, Med Clin 3, Dept Nephrol, D-60590 Frankfurt, Germany
[2] MIAMED GmbH, D-50677 Cologne, Germany
[3] Vivantes Auguste Viktoria Clin, Dept Urol, D-12157 Berlin, Germany
[4] Univ Hosp Hamburg Eppendorf, Dept Internal Med, D-20246 Hamburg, Germany
Keywords
Multiple-choice questions; Pattern recognition; Clinical reasoning; eLearning; ITEM; PERFORMANCE; COMPETENCE; EXPERIENCE; EDUCATION;
DOI
10.1186/1472-6920-14-232
Chinese Library Classification
G40 [Education]
Discipline classification codes
040101; 120403
Abstract
Background: Multiple-choice questions (MCQs) are still widely used in high-stakes medical exams. We wanted to examine whether, and to what extent, a national licensing exam uses the concept of pattern recognition to test applied clinical knowledge.

Methods: We categorized all 4,134 German national medical licensing exam questions administered between October 2006 and October 2012 by discipline, year, and type. We analyzed questions from the four largest disciplines, internal medicine (n = 931), neurology (n = 305), pediatrics (n = 281), and surgery (n = 233), with respect to the following question types: knowledge questions (KQ), pattern recognition questions (PRQ), inverse PRQs (IPRQ), and pseudo PRQs (PPRQ).

Results: A total of 51.1% of all questions were of a higher taxonomical order (PRQ and IPRQ), with a significant decrease in the percentage of these questions (p < 0.001) from 2006 (61.5%) to 2012 (41.6%). The proportion of PRQs and IPRQs was significantly lower (p < 0.001) in internal medicine and surgery than in neurology and pediatrics. PRQs were used mostly in questions about diagnoses (71.7%). A significantly higher (p < 0.05) percentage of PR/therapy questions was found for internal medicine compared with neurology and pediatrics.

Conclusion: The concept of pattern recognition is used with different priorities and to varying extents by the different disciplines in a high-stakes exam to test applied clinical knowledge. Awareness of this concept may aid in designing and balancing MCQs in an exam with respect to testing clinical reasoning as a desired skill at the threshold of postgraduate medical education.
Pages: 6