Inter-rater reliability of the STEP protocol

Cited by: 6
Authors
Edman, Å [1]
Mahnfeldt, M
Wallin, A
Affiliations
[1] Univ Gothenburg, Sahlgrenska Univ Hosp, Inst Clin Neurosci, Psychiat Sect, SE-43180 Molndal, Sweden
[2] Varberg Hosp, Dept Psychiat, Varberg, Sweden
DOI
10.1177/089198870101400307
Chinese Library Classification (CLC): R592 [Geriatrics]; C [Social Sciences, General]
Discipline codes: 03; 0303; 100203
Abstract
An inter-rater reliability test of the Stepwise Comparative Status Analysis (STEP) is presented. The STEP is a protocol for the clinical examination of patients with dementia, within the scope of a neuropsychiatric investigation. It combines psychiatric and neurologic bedside examination methods. The analysis is made in three steps where primary, observable symptom variables are successively aggregated via compound variables to the final determination of one of seven possible dominant regional brain syndromes (global, frontal, subcortical, parietal, frontosubcortical, frontoparietal, other), here also called complex variables. In the present study, two senior physicians assessed 50 patients independently and simultaneously. None of the patients was known to both physicians. In 42 patients (84%), the same dominant brain syndrome was determined by the two clinicians. The probability (P value) of this (or better) agreement was calculated at 2.0 × 10⁻¹². Kappa coefficients were calculated as a measure of assessment agreement regarding the 50 STEP variables. For 20 variables, the coefficient was 0.75 or above, indicating excellent agreement; for 22 variables, the coefficient was below 0.75 and above 0.40, indicating moderate agreement; and for 4 variables, the value was 0.40 or below, indicating poor agreement. Kappa calculations regarding the assessments of four variables were either not possible or were considered inappropriate.
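As an illustration of the agreement statistic used in the abstract, the sketch below computes Cohen's kappa for two raters (the standard observed-minus-chance formula) and buckets the result with the cut-offs quoted above (≥0.75 excellent, 0.40–0.75 moderate, ≤0.40 poor). The rating labels in the example are hypothetical; they are not data from the study.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: (observed agreement - chance agreement) / (1 - chance agreement)."""
    assert len(rater_a) == len(rater_b) and rater_a, "raters must score the same items"
    n = len(rater_a)
    # Proportion of items on which the two raters agree.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal label frequencies.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b.get(c, 0) for c in counts_a) / n ** 2
    return (observed - expected) / (1 - expected)

def agreement_label(kappa):
    """Bucket a kappa value using the cut-offs quoted in the abstract."""
    if kappa >= 0.75:
        return "excellent"
    if kappa > 0.40:
        return "moderate"
    return "poor"

# Hypothetical syndrome ratings for four patients by two clinicians.
a = ["frontal", "frontal", "global", "global"]
b = ["frontal", "global", "global", "global"]
print(agreement_label(cohens_kappa(a, b)))
```

Note that kappa is undefined when chance agreement is 1 (both raters always assign a single identical label); a production implementation would guard that case.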
Pages: 140-144 (5 pages)
Related papers (50 entries)
  • [1] Testing the Reliability of Inter-Rater Reliability
    Eagan, Brendan
    Brohinsky, Jais
    Wang, Jingyi
    Shaffer, David Williamson
    LAK20: THE TENTH INTERNATIONAL CONFERENCE ON LEARNING ANALYTICS & KNOWLEDGE, 2020, : 454 - 461
  • [2] OWAS inter-rater reliability
    Lins, Christian
    Fudickar, Sebastian
    Hein, Andreas
    APPLIED ERGONOMICS, 2021, 93
  • [3] ASSESSMENT OF INTER-RATER RELIABILITY
    BEARD, K
    STEWART, DA
    AGE AND AGEING, 1989, 18 (05) : 354 - 354
  • [4] Inter-rater reliability of the Illinois Structured Decision Support Protocol
    Kang, HA
    Poertner, J
    CHILD ABUSE & NEGLECT, 2006, 30 (06) : 679 - 689
  • [5] Inter-rater Reliability Testing of the Safety Protocol for Thirst Management
    do Nascimento, Leonel A.
    Fonseca, Ligia F.
    dos Santos, Claudia B.
    JOURNAL OF PERIANESTHESIA NURSING, 2018, 33 (04) : 527 - 536
  • [6] INTER-RATER RELIABILITY FROM THE VIEWPOINT OF THE RATER
    SORENSON, AG
    GROSS, CF
    PERSONNEL AND GUIDANCE JOURNAL, 1957, 35 (06): : 365 - 368
  • [7] Comparison between Inter-rater Reliability and Inter-rater Agreement in Performance Assessment
    Liao, Shih Chieh
    Hunt, Elizabeth A.
    Chen, Walter
    ANNALS ACADEMY OF MEDICINE SINGAPORE, 2010, 39 (08) : 613 - 618
  • [8] Inter-rater and Intra-rater Reliability of the Videofluoroscopic Dysphagia Scale with the Standardized Protocol
    Min, Ingi
    Woo, Hyeonseong
    Kim, Jae Yoon
    Kim, Tae-Lim
    Lee, Yookyung
    Chang, Won Kee
    Jung, Se Hee
    Lee, Woo Hyung
    Oh, Byung-Mo
    Han, Tai Ryoon
    Seo, Han Gil
    DYSPHAGIA, 2024, 39 (01) : 43 - 51
  • [9] Inter-rater Reliability of the HEART Score
    Gershon, Colin A.
    Yagapen, Annick N.
    Lin, Amber
    Yanez, David
    Sun, Benjamin C.
    ACADEMIC EMERGENCY MEDICINE, 2019, 26 (05) : 552 - 555