Measuring agreement of administrative data with chart data using prevalence unadjusted and adjusted kappa

Cited: 117
Authors
Chen, Guanmin [1 ,4 ]
Faris, Peter [2 ]
Hemmelgarn, Brenda [1 ,3 ]
Walker, Robin L. [1 ]
Quan, Hude [1 ,4 ]
Affiliations
[1] Univ Calgary, Dept Community Hlth Sci, Calgary, AB, Canada
[2] Alberta Bone & Joint Hlth Inst, Calgary, AB, Canada
[3] Univ Calgary, Dept Med, Calgary, AB, Canada
[4] Univ Calgary, Ctr Hlth & Policy Studies, Calgary, AB T2N 4N1, Canada
Source
BMC MEDICAL RESEARCH METHODOLOGY | 2009, Vol. 9
Funding
Canadian Institutes of Health Research;
Keywords
CLASSIFICATION-OF-DISEASES; INTERRATER AGREEMENT; NEOSPORA-CANINUM; WEIGHTED KAPPA; PARADOXES; RELIABILITY; BIAS; STROKE;
DOI
10.1186/1471-2288-9-5
Chinese Library Classification
R19 [Health organization and services (health service management)];
Abstract
Background: Kappa is commonly used when assessing the agreement of coded conditions with a reference standard, but it has been criticized for being highly dependent on prevalence. To overcome this limitation, the prevalence-adjusted and bias-adjusted kappa (PABAK) has been developed. The purpose of this study is to demonstrate the performance of kappa and PABAK, and to assess the agreement between hospital discharge administrative data and chart review data.
Methods: Agreement was compared for random sampling, restricted sampling by condition, and case-control sampling, using ICD-10 administrative data from four teaching hospitals in Alberta, Canada, for discharges between January 1, 2003 and June 30, 2003. A total of 4,008 hospital discharge records and chart reviews, linked by unique personal identifier and admission date, were analyzed for 32 conditions in the random sample. Restricted samples for hypertension, myocardial infarction, and congestive heart failure, and case-control samples for the same three conditions, were extracted from the random sample. The prevalence, kappa, PABAK, positive agreement, and negative agreement for each condition were compared across the three samples.
Results: The prevalence of each condition was highly dependent on the sampling method, and this variation in prevalence had a significant effect on both kappa and PABAK. PABAK values were markedly high for certain conditions with low kappa values. The gap between the two statistics for the same condition narrowed as the prevalence of the condition approached 50%.
Conclusion: Kappa values varied more widely than PABAK values across the 32 conditions. PABAK values should generally not be interpreted as measuring the same agreement as kappa in administrative data, particularly for conditions with low prevalence. No single statistic of agreement captures all the information needed to judge the validity of administrative data. Researchers should report kappa, the prevalence, positive agreement, negative agreement, and the relative frequency in each cell of the 2x2 table (i.e. a, b, c and d) to enable the reader to judge the validity of administrative data from multiple aspects.
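All of the statistics named in the abstract can be computed from the four cells (a, b, c, d) of a 2x2 agreement table. The sketch below uses standard formulas for Cohen's kappa, PABAK, and positive/negative agreement; the table values are illustrative, not taken from the study:

```python
def agreement_stats(a, b, c, d):
    """Agreement statistics for a 2x2 table comparing administrative
    data against chart review:
        a = both positive,  b = admin + / chart -,
        c = admin - / chart +,  d = both negative."""
    n = a + b + c + d
    po = (a + d) / n                       # observed agreement
    # chance agreement expected from the marginal totals
    pe = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2
    kappa = (po - pe) / (1 - pe)           # Cohen's kappa
    pabak = 2 * po - 1                     # prevalence- and bias-adjusted kappa
    p_pos = 2 * a / (2 * a + b + c)        # positive agreement
    p_neg = 2 * d / (2 * d + b + c)        # negative agreement
    prevalence = (a + c) / n               # prevalence per chart review
    return {"prevalence": prevalence, "kappa": kappa, "pabak": pabak,
            "p_pos": p_pos, "p_neg": p_neg}

# A hypothetical low-prevalence condition: raw agreement is high (85%),
# kappa is modest, yet PABAK stays high -- the gap the paper describes.
stats = agreement_stats(a=5, b=10, c=5, d=80)
print(stats)
```

With this table the prevalence is 10%, kappa is about 0.32, and PABAK is 0.70, showing how the two statistics diverge when prevalence is far from 50%.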
Pages: 8