Interobserver and Intraobserver Agreement are Unsatisfactory When Determining Abstract Study Design and Level of Evidence
Cited: 1
Authors:
Patel, Neeraj M. [1]
Schmitz, Matthew R. [2]
Bastrom, Tracey P. [4]
Ghag, Ravi [11]
Janicki, Joseph A. [1]
Kushare, Indranil V. [3]
Lewis, Ronald [5]
Mistovich, Ronald Justin [6]
Nelson, Susan E. [7]
Sawyer, Jeffrey R. [8]
Vanderhave, Kelly L. [9]
Wallace, Maegen J. [10]
McKay, Scott D. [3]
Affiliations:
[1] Ann & Robert H Lurie Childrens Hosp, Chicago, IL USA
[2] San Antonio Mil Med Ctr, San Antonio, TX USA
[3] Texas Childrens Hosp, Houston, TX 77030 USA
[4] Rady Childrens Hosp, San Diego, CA USA
[5] Pediat Orthopaed Charleston, Summerville, SC USA
[6] Rainbow Babies & Childrens Hosp, 2101 Adelbert Rd, Cleveland, OH 44106 USA
[7] Univ Rochester, Med Ctr, Dept Orthopaed, Rochester, NY USA
[8] Le Bonheur Childrens Hosp, Memphis, TN USA
[9] Carolinas Med Ctr, Charlotte, NC 28203 USA
[10] Univ Nebraska Med Ctr, Omaha, NE USA
[11] British Columbia Childrens Hosp, Vancouver, BC, Canada
Keywords:
study design;
level of evidence;
reliability;
CONFUSION;
DOI:
10.1097/BPO.0000000000002136
Chinese Library Classification (CLC):
R826.8 [Plastic surgery];
R782.2 [Oral and maxillofacial plastic surgery];
R726.2 [Pediatric plastic surgery];
R62 [Plastic surgery (reconstructive surgery)];
Abstract:
Background: Understanding the differences between types of study design (SD) and level of evidence (LOE) is important when selecting research for presentation or publication and determining its potential clinical impact. The purpose of this study was to evaluate interobserver and intraobserver reliability when assigning LOE and SD, as well as to quantify the impact of a commonly used reference aid on these assessments.
Methods: Thirty-six accepted abstracts from the Pediatric Orthopaedic Society of North America (POSNA) 2021 annual meeting were selected for this study. Thirteen reviewers from the POSNA Evidence-Based Practice Committee were asked to determine LOE and SD for each abstract, first without any assistance or resources. Four weeks later, abstracts were reviewed again with the guidance of the Journal of Bone and Joint Surgery (JBJS) LOE chart, which is adapted from the Oxford Centre for Evidence-Based Medicine. Interobserver and intraobserver reliability were calculated using Fleiss' kappa statistic (κ). Chi-square (χ²) analysis was used to compare the rate of SD-LOE mismatch between the first and second rounds of review.
Results: Interobserver reliability for LOE improved slightly from fair (κ = 0.28) to moderate (κ = 0.43) with use of the JBJS chart. There was better agreement with increasing LOE, with the most frequent disagreement between levels 3 and 4. Interobserver reliability for SD was fair for both round 1 (κ = 0.29) and round 2 (κ = 0.37). As with LOE, there was better agreement with stronger SD. Intraobserver reliability was widely variable for both LOE and SD (κ = 0.10 to 0.92 for both). When matching a selected SD to its associated LOE, the overall rate of correct concordance was 82% in round 1 and 92% in round 2 (P < 0.001).
Conclusion: Interobserver reliability for LOE and SD was fair to moderate at best, even among experienced reviewers. Use of the JBJS/Oxford chart mildly improved agreement on LOE and resulted in less SD-LOE mismatch, but did not affect agreement on SD.
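For context, the agreement statistic named in the Methods can be computed with standard tools. The sketch below is illustrative only and is not taken from the paper: it assumes the Python statsmodels library and substitutes made-up reviewer ratings for the POSNA review data.

import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

# Illustrative data only: rows = abstracts, columns = reviewers,
# values = the LOE (1-4) each reviewer assigned to that abstract.
ratings = np.array([
    [3, 3, 4, 3, 3],
    [1, 1, 1, 1, 2],
    [4, 3, 4, 4, 4],
    [2, 2, 2, 3, 2],
])

# Collapse the subject-by-rater matrix into a subject-by-category count table.
table, _categories = aggregate_raters(ratings)

# Fleiss' kappa across all reviewers; by the usual convention,
# 0.21-0.40 is "fair" agreement and 0.41-0.60 is "moderate".
kappa = fleiss_kappa(table, method='fleiss')
print(f"Fleiss' kappa = {kappa:.2f}")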
Pages: E696-E700
Page count: 5