Inter-rater reliability of the STEP protocol

Cited by: 6
Authors
Edman, Å [1]
Mahnfeldt, M
Wallin, A
Affiliations
[1] Univ Gothenburg, Sahlgrenska Univ Hosp, Inst Clin Neurosci, Psychiat Sect, SE-43180 Molndal, Sweden
[2] Varberg Hosp, Dept Psychiat, Varberg, Sweden
DOI
10.1177/089198870101400307
Chinese Library Classification
R592 [geriatrics]; C [social sciences, general];
Discipline codes
03 ; 0303 ; 100203 ;
Abstract
An inter-rater reliability test of the Stepwise Comparative Status Analysis (STEP) is presented. The STEP is a protocol for the clinical examination of patients with dementia, within the scope of a neuropsychiatric investigation. It combines psychiatric and neurologic bedside examination methods. The analysis is made in three steps where primary, observable symptom variables are successively aggregated via compound variables to the final determination of one of seven possible dominant regional brain syndromes (global, frontal, subcortical, parietal, frontosubcortical, frontoparietal, other), here also called complex variables. In the present study, two senior physicians assessed 50 patients independently and simultaneously. None of the patients was known to both physicians. In 42 patients (84%), the same dominant brain syndrome was determined by the two clinicians. The probability (P value) of this (or better) agreement was calculated at 2.0 × 10⁻¹². Kappa coefficients were calculated as a measure of assessment agreement regarding the 50 STEP variables. For 20 variables, the coefficient was 0.75 or above, indicating excellent agreement; for 22 variables, the coefficient was below 0.75 and above 0.40, indicating moderate agreement; and for 4 variables, the value was 0.40 or below, indicating poor agreement. Kappa calculations regarding the assessments of four variables were either not possible or were considered inappropriate.
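The agreement statistic used in the abstract is Cohen's kappa, which corrects raw two-rater agreement for the agreement expected by chance. A minimal sketch of the calculation is shown below; the function name and the example syndrome labels are illustrative, not data from the study.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning categorical labels.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement
    and p_e is chance agreement from each rater's marginal frequencies.
    """
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of cases where both raters match.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of the raters' marginal proportions,
    # summed over all categories either rater used.
    ca, cb = Counter(rater_a), Counter(rater_b)
    p_e = sum((ca[c] / n) * (cb[c] / n) for c in set(ca) | set(cb))
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two clinicians labeling six patients with a
# dominant brain syndrome (labels drawn from the STEP categories).
a = ["frontal", "global", "frontal", "parietal", "frontal", "global"]
b = ["frontal", "global", "parietal", "parietal", "frontal", "frontal"]
print(round(cohens_kappa(a, b), 3))  # → 0.478
```

By the thresholds cited in the abstract, a kappa of 0.478 (above 0.40, below 0.75) would count as moderate agreement.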
Pages: 140-144 (5 pages)
Related papers (50 total)
  • [31] Inter-rater reliability of the GNRBO® knee arthrometer
    Vauhnik, Renata
    Morrissey, Matthew C.
    Perme, Maja Pohar
    Sevsek, France
    Rugelj, Darja
    KNEE, 2014, 21 (02) : 541 - 543
  • [32] The inter-rater reliability of the story retell procedure
    Hula, WD
    McNeil, MR
    Doyle, PJ
    Rubinsky, HJ
    Fossett, TRD
    APHASIOLOGY, 2003, 17 (05) : 523 - 528
  • [33] The inter-rater reliability of mental capacity assessments
    Raymont, Vanessa
    Buchanan, Alec
    David, Anthony S.
    Hayward, Peter
    Wessely, Simon
    Hotopf, Matthew
    INTERNATIONAL JOURNAL OF LAW AND PSYCHIATRY, 2007, 30 (02) : 112 - 117
  • [34] Cheiloscopy: Lip Print Inter-rater Reliability
    Furnari, Winnie
    Janal, Malvin N.
    JOURNAL OF FORENSIC SCIENCES, 2017, 62 (03) : 782 - 785
  • [35] Inter-rater reliability of primitive signs in dementia
    Plutino, Andrea
    Baldinelli, Sara
    Fiori, Chiara
    Ranaldi, Valentina
    Silvestrini, Mauro
    Luzzi, Simona
    CLINICAL NEUROLOGY AND NEUROSURGERY, 2019, 187
  • [36] Inter-rater reliability of WHO 'disability' grading
    Brandsma, W
    Larsen, M
    Richard, C
    Ebenezer, M
    LEPROSY REVIEW, 2004, 75 (02) : 131 - 134
  • [37] Diagnosing delusions: A review of inter-rater reliability
    Bell, Vaughan
    Halligan, Peter W.
    Ellis, Hadyn D.
    SCHIZOPHRENIA RESEARCH, 2006, 86 (1-3) : 76 - 79
  • [38] AN INTER-RATER RELIABILITY STUDY OF THE BARTHEL INDEX
    ROY, CW
    TOGNERI, J
    HAY, E
    PENTLAND, B
    INTERNATIONAL JOURNAL OF REHABILITATION RESEARCH, 1988, 11 (01) : 67 - 70
  • [39] Inter-rater reliability of the Rome criteria in children
    Saps, M
    Di Lorenzo, C
    GASTROENTEROLOGY, 2004, 126 (04) : A378 - A378
  • [40] SIMPLE DEVICE FOR IMPROVING INTER-RATER RELIABILITY
    MCQUEEN, WM
    BEHAVIOR THERAPY, 1975, 6 (01) : 128 - 129