Interrater Agreement in the Evaluation of Discrepant Imaging Findings With the Radpeer System

Cited by: 55
Authors
Bender, Leila C. [1 ]
Linnau, Ken F. [1 ]
Meier, Eric N. [2 ]
Anzai, Yoshimi [1 ]
Gunn, Martin L. [1 ]
Affiliations
[1] Univ Washington, Dept Radiol, Seattle, WA 98104 USA
[2] Univ Washington, Dept Biostat, Seattle, WA 98104 USA
Keywords
quality assurance; quality improvement; Radpeer; radiology quality; peer assessments; reliability
DOI
10.2214/AJR.12.8972
Chinese Library Classification
R8 [Special Medicine]; R445 [Diagnostic Imaging]
Discipline Codes
1002; 100207; 1009
Abstract
OBJECTIVE. The Radpeer system is central to the quality assurance process in many radiology practices. Previous studies have shown poor agreement between physicians in the evaluation of their peers. The purpose of this study was to assess the reliability of the Radpeer scoring system.
MATERIALS AND METHODS. A sample of 25 discrepant cases was extracted from our quality assurance database. Images were anonymized, and the associated reports and the identities of the interpreting radiologists were removed. Indications for the studies and descriptions of the discrepancies were provided. Twenty-one subspecialist attending radiologists rated the cases using the Radpeer scoring system. Multirater kappa statistics were used to assess interrater agreement, both with the standard scoring system and with scores dichotomized to reflect the practice of further review for cases rated 3 and 4. Subgroup analyses were conducted to assess subspecialist evaluation of cases.
RESULTS. Interrater agreement was slight to fair compared with that expected by chance. For the group of 21 raters, the kappa values were 0.11 (95% CI, 0.06-0.16) with the standard scoring system and 0.20 (95% CI, 0.13-0.27) with dichotomized scores. There was disagreement about whether a discrepancy had occurred in 20 cases. Subgroup analyses did not reveal significant differences in the degree of interrater agreement.
CONCLUSION. The identification of discrepant interpretations is valuable for the education of individual radiologists and for larger-scale quality assurance and quality improvement efforts. Our results show that a ratings-based peer review system is unreliable and subjective for the evaluation of discrepant interpretations. Resources should be devoted to developing more robust and objective assessment procedures, particularly those with clear quality improvement goals.
Pages: 1320-1327
Page count: 8
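The abstract's reliability analysis rests on a multirater (Fleiss) kappa computed over a case-by-category count matrix, once with the full four-point Radpeer scale and once with scores dichotomized at the score-3 "further review" threshold. The study's 25-case, 21-rater data are not reproduced here, so the small rating matrix below is hypothetical; this is only a sketch of the statistic, not the authors' code.

```python
# Sketch of a multirater (Fleiss) kappa, as used in the abstract.
# counts[i][j] = number of raters who assigned category j to case i;
# every row must sum to the same number of raters n.

def fleiss_kappa(counts):
    N = len(counts)        # number of cases
    n = sum(counts[0])     # raters per case
    # Mean per-case agreement P_bar
    P_bar = sum(
        (sum(c * c for c in row) - n) / (n * (n - 1)) for row in counts
    ) / N
    # Chance agreement P_e from marginal category proportions
    total = N * n
    k = len(counts[0])
    p = [sum(row[j] for row in counts) / total for j in range(k)]
    P_e = sum(pj * pj for pj in p)
    return (P_bar - P_e) / (1 - P_e)

def dichotomize(counts):
    """Collapse Radpeer scores (1, 2) vs (3, 4), mirroring the
    'further review for cases rated 3 and 4' threshold."""
    return [[row[0] + row[1], row[2] + row[3]] for row in counts]

# Hypothetical toy data: 3 cases, 4 raters, Radpeer scores 1-4 as columns.
ratings = [
    [4, 0, 0, 0],  # all four raters assign score 1
    [0, 2, 2, 0],  # raters split between scores 2 and 3
    [0, 0, 4, 0],  # all four raters assign score 3
]
print(round(fleiss_kappa(ratings), 3))               # 0.636
print(round(fleiss_kappa(dichotomize(ratings)), 3))  # 0.556
```

Note that dichotomizing can move kappa in either direction: it coarsens the scale (raising raw agreement) but also changes the chance-agreement baseline, which is why the paper reports both values.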