Inter-rater reliability of hand motor function assessment in Parkinson's disease: Impact of clinician training

Cited by: 1
Authors
Kenny, Lorna [1 ]
Azizi, Zahra [1 ]
Moore, Kevin [1 ]
Alcock, Megan [2 ]
Heywood, Sarah [2 ]
Jonsson, Agnes [2 ]
McGrath, Keith [2 ]
Foley, Mary J. [3 ]
Sweeney, Brian [4 ]
O'Sullivan, Sean [4 ]
Barton, John [5 ]
Tedesco, Salvatore [5 ]
Sica, Marco [5 ]
Crowe, Colum [5 ]
Timmons, Suzanne [1 ]
Affiliations
[1] Univ Coll Cork, Ctr Gerontol & Rehabil, Sch Med, Cork T12 XH60, Ireland
[2] Mercy Univ Hosp, Cork T12 WE28, Ireland
[3] Cork Stroke Support Ctr, Cork T12 AKA4, Ireland
[4] Bon Secours Hosp, Neurol Dept, Cork T12 DV56, Ireland
[5] Univ Coll Cork, Tyndall Natl Inst, Cork T12 AV22, Ireland
Source
CLINICAL PARKINSONISM & RELATED DISORDERS, 2024, Vol. 11
Keywords
Parkinson's disease; Assessment; Inter-rater reliability; Variability; MDS-UPDRS; Rating scale
DOI
10.1016/j.prdoa.2024.100278
Chinese Library Classification
R74 [Neurology and Psychiatry]
Abstract
Medication adjustments in Parkinson's disease (PD) are guided by patients' subjective reports and clinicians' ratings of motor feature severity (such as bradykinesia and tremor).
Objective: Because patients may be seen by different clinicians at different visits, this study aimed to determine the inter-rater reliability of upper limb motor function assessment among clinicians treating people with PD (PwPD).
Methods: PwPD performed six standardised hand movements from the Movement Disorder Society's Unified Parkinson's Disease Rating Scale (MDS-UPDRS) while two cameras recorded simultaneously. Eight clinicians independently rated tremor and bradykinesia severity using a visual analogue scale. We compared intraclass correlation coefficients (ICCs) before and after a training/calibration session in which high-variance participant videos were reviewed and the MDS-UPDRS instructions discussed.
Results: In the first round, agreement was poor for most hand movements, with the best agreement for resting tremor (ICC 0.66 bilaterally; right hand 95 % CI 0.50-0.82; left hand 95 % CI 0.50-0.81). Postural tremor (left hand) showed poor agreement (ICC 0.14; 95 % CI 0.04-0.33), as did wrist pronation-supination (right hand ICC 0.34; 95 % CI 0.19-0.56). In post-training rating exercises, agreement improved, especially for the right hand. The best agreement was observed for hand open-close ratings in the left hand (ICC 0.82, 95 % CI 0.64-0.94) and resting tremor in the right hand (ICC 0.92, 95 % CI 0.83-0.98). Raters' discrimination between right- and left-hand features also improved, except for resting tremor (which worsened) and wrist pronation-supination (no change).
Conclusions: Clinicians vary in rating video-recorded PD upper limb motor features, especially bradykinesia, but this can be improved somewhat with training.
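The reliability analysis rests on the intraclass correlation coefficient computed across the eight raters. The abstract does not state which ICC form was used; the sketch below assumes a two-way random-effects, absolute-agreement, single-rater model (ICC(2,1)) applied to a subjects-by-raters matrix of visual analogue scores. The data, function name, and simulated scores are illustrative, not taken from the study.

    import numpy as np

    def icc_2_1(ratings: np.ndarray) -> float:
        """ICC(2,1): two-way random effects, absolute agreement, single rater.

        ratings: (n_subjects, n_raters) array of scores, no missing values.
        """
        n, k = ratings.shape
        grand_mean = ratings.mean()
        row_means = ratings.mean(axis=1)   # per-subject means
        col_means = ratings.mean(axis=0)   # per-rater means

        # Two-way ANOVA sums of squares
        ss_total = ((ratings - grand_mean) ** 2).sum()
        ss_rows = k * ((row_means - grand_mean) ** 2).sum()   # between subjects
        ss_cols = n * ((col_means - grand_mean) ** 2).sum()   # between raters
        ss_error = ss_total - ss_rows - ss_cols                # residual

        # Mean squares
        ms_rows = ss_rows / (n - 1)
        ms_cols = ss_cols / (k - 1)
        ms_error = ss_error / ((n - 1) * (k - 1))

        # Shrout & Fleiss (1979) ICC(2,1)
        return (ms_rows - ms_error) / (
            ms_rows + (k - 1) * ms_error + k * (ms_cols - ms_error) / n
        )

    # Hypothetical example: 6 participants rated by 8 clinicians on a 0-100 VAS
    rng = np.random.default_rng(0)
    true_severity = rng.uniform(10, 80, size=(6, 1))
    scores = true_severity + rng.normal(0, 8, size=(6, 8))  # add rater noise
    print(f"ICC(2,1) = {icc_2_1(scores):.2f}")

A separate ICC would be computed per movement and per hand, before and after the training session, to reproduce the kind of comparison the abstract describes.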
Pages: 7