Meta-analysis of Cohen's kappa

Cited by: 164
Authors
Sun S. [1]
Affiliations
[1] School of Education, University of Cincinnati, Cincinnati, OH 45221
Keywords
Cohen's κ; Generalizability; Inter-rater reliability; Meta-analysis
DOI
10.1007/s10742-011-0077-3
Abstract
Cohen's κ is the most important and most widely accepted measure of inter-rater reliability when the outcome of interest is measured on a nominal scale. The estimates of Cohen's κ usually vary from one study to another due to differences in study settings, test properties, rater characteristics and subject characteristics. This study proposes a formal statistical framework for meta-analysis of Cohen's κ to describe the typical inter-rater reliability estimate across multiple studies, to quantify between-study variation and to evaluate the contribution of moderators to heterogeneity. To demonstrate the application of the proposed statistical framework, a meta-analysis of Cohen's κ is conducted for pressure ulcer classification systems. Implications and directions for future research are discussed. © 2011 Springer Science+Business Media, LLC.
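To make the quantities described in the abstract concrete, the sketch below (not taken from the paper; the function names and the choice of the DerSimonian-Laird estimator are illustrative assumptions) shows how Cohen's κ is computed for two raters from a confusion matrix, and how study-level κ estimates might be pooled under a random-effects model that quantifies between-study variance τ².

```python
def cohens_kappa(table):
    """Cohen's kappa for a square confusion matrix of rater counts.

    table[i][j] = number of subjects rater A placed in category i
    and rater B placed in category j.
    """
    n = sum(sum(row) for row in table)
    k = len(table)
    p_o = sum(table[i][i] for i in range(k)) / n              # observed agreement
    row = [sum(table[i]) / n for i in range(k)]               # rater A marginals
    col = [sum(table[i][j] for i in range(k)) / n for j in range(k)]  # rater B marginals
    p_e = sum(row[i] * col[i] for i in range(k))              # chance-expected agreement
    return (p_o - p_e) / (1 - p_e)

def random_effects_pool(kappas, variances):
    """DerSimonian-Laird random-effects pooling of study-level estimates.

    Returns the pooled estimate and the between-study variance tau^2.
    """
    w = [1 / v for v in variances]
    fixed = sum(wi * ki for wi, ki in zip(w, kappas)) / sum(w)
    q = sum(wi * (ki - fixed) ** 2 for wi, ki in zip(w, kappas))  # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(kappas) - 1)) / c)              # between-study variance
    w_star = [1 / (v + tau2) for v in variances]              # random-effects weights
    pooled = sum(wi * ki for wi, ki in zip(w_star, kappas)) / sum(w_star)
    return pooled, tau2
```

For example, a 2x2 table `[[20, 5], [10, 15]]` gives observed agreement 0.70, chance agreement 0.50, and hence κ = 0.40; feeding several such κ estimates with their sampling variances into `random_effects_pool` yields a typical κ across studies plus τ² as a heterogeneity measure.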
Pages: 145-163 (18 pages)