A snapshot of current Objective Structured Clinical Examination (OSCE) practice at Australian medical schools

Cited by: 13
Authors
Heal, Clare [1 ]
D'Souza, Karen [4 ]
Banks, Jennifer [1 ]
Malau-Aduli, Bunmi S. [2 ,3 ]
Turner, Richard [5 ]
Smith, Jane [6 ]
Bray, Ebony [1 ]
Shires, Lizzi [5 ]
Wilson, Ian [7 ]
Affiliations
[1] James Cook Univ, Coll Med & Dent, Mackay, Qld, Australia
[2] James Cook Univ, Coll Med & Dent, Med Educ, Mackay, Qld, Australia
[3] James Cook Univ, Coll Med & Dent, Assessment & Evaluat, Mackay, Qld, Australia
[4] Deakin Univ, Sch Med, Clin Curriculum & Assessment, Geelong, Vic, Australia
[5] Univ Tasmania, Sch Med, Hobart, Tas, Australia
[6] Bond Univ, Sch Med, Gold Coast, Australia
[7] Univ Wollongong, Grad Sch Med, Wollongong, NSW, Australia
Keywords
Competence; Assessments; Examiners; Education; Quality; Ratings; Skills
DOI
10.1080/0142159X.2018.1487547
CLC Number
G40 [Education]
Subject Classification Codes
040101; 120403
Abstract
Objective: The objective of this collaborative study was to compare current practices in conducting high-stakes, exit-level Objective Structured Clinical Examinations (OSCEs) at all Australian medical schools. We aimed to document similarities and differences between schools and to compare existing practice against available gold-standard, evidence-based practice. We also aimed to identify areas where gold standards do not currently exist and could be developed in the future.
Methods: A 72-item semi-structured questionnaire was sent to all 19 Australian medical schools with graduating students.
Results: A total of 18/19 schools responded. Of these, 16/18 schools had summative exit-level OSCEs representing content from multiple medical specialties. The total number of OSCE stations varied from 8 to 16, with total OSCE testing time ranging from 70 to 160 min. All schools blueprinted their OSCE to their curriculum, and all trained their simulated patients and examiners. There was variation in the format of the marking rubric used.
Conclusions: This study has provided insight into the current OSCE practices of the majority of medical schools in Australia. Whilst the comparative data reveal wide variation in OSCE practices between schools, many recommended "gold standard" OSCE practices are implemented. The collective awareness of our similarities and differences provides us with a baseline platform, as well as an impetus for iterative quality improvement. Such discourse also serves to develop new gold standards in practice where none have previously existed.
Pages: 441-447 (7 pages)