The objective structured public health examination: A study of reliability using multi-level analysis

Cited by: 2
Authors
Davison, Ian [1]
Cooper, Rob
Bullock, Alison [2]
Affiliations
[1] Univ Birmingham, Ctr Res Med & Dent Educ, Birmingham B15 2TT, W Midlands, England
[2] Cardiff Univ, Cardiff, S Glam, Wales
Keywords
CLINICAL EXAMINATIONS; SKILLS; PERFORMANCE; COMPETENCE; OSCE
DOI
10.3109/01421590903514622
Chinese Library Classification
G40 [Education]
Subject classification codes
040101; 120403
Abstract
Background: Introduced in 2006, the objective structured public health examination (OSPHE) is part of the examination for Membership of the UK Faculty of Public Health. Designed to simulate real work in public health, it assesses the same five generic competencies at each of six stations, within a total testing time of 1 h 48 min per candidate. This study estimates the reliability of the OSPHE. Methods: Data from all 198 candidates, 54 questions and 39 examiners over nine test days in 2006 were analysed for reliability and sources of error using multilevel cross-classified analysis and generalisability theory. Results: A reliability coefficient of 0.85 was estimated (95% confidence interval 0.82-0.88). Little variance was attributable to questions or examiners. Increasing the number of questions from six to twelve would raise reliability to 0.92, whereas using more than one examiner per question would have a negligible impact on reliability. Conclusion: Multilevel modelling yields an estimate of examination reliability that is more rigorous than Cronbach's alpha; traditional generalisability methods based on analysis of variance (ANOVA) cannot do this. The reliability of this examination compares favourably with the literature despite its having just six stations and under 2 h of testing per candidate. Other specialties could consider parallel testing of the same key competencies at all stations within simulated workplace events.
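The projected improvement from six to twelve questions is consistent with the Spearman-Brown prophecy formula; the following worked check is our illustration, not a calculation reported in the record itself. Lengthening a test by a factor $k$ changes reliability as

\[ r_k = \frac{k\,r_1}{1 + (k-1)\,r_1}, \qquad r_{12} = \frac{2 \times 0.85}{1 + 0.85} \approx 0.92, \]

where $r_1 = 0.85$ is the reported reliability of the six-question examination and $k = 2$ is the lengthening factor, reproducing the 0.92 figure quoted in the abstract. In generalisability-theory terms (a textbook form, assuming a single undifferentiated error term rather than the paper's full variance decomposition), the coefficient is $G = \sigma^2_{\text{candidate}} / (\sigma^2_{\text{candidate}} + \sigma^2_{\text{error}}/n)$; the finding that questions and examiners contribute little variance is what permits high reliability with only six stations.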
Pages: 582-585
Page count: 4