A Modular Health-Related Quality of Life Instrument for Electronic Assessment and Treatment Monitoring: Web-Based Development and Psychometric Validation of Core Thrive Items

Cited by: 1
Authors
Wicks, Paul [1 ]
McCaffrey, Stacey [1 ]
Goodwin, Kim [1 ]
Black, Ryan [1 ]
Hoole, Michael [1 ]
Heywood, James [1 ]
Affiliations
[1] PatientsLikeMe, 160 2nd St, Cambridge, MA 02142 USA
Keywords
personal health records; health-related quality of life; patient-reported outcome measures; patient-reported outcomes; functional rating scale; conceptual model; ALSFRS-R; validity; PHQ-9; platform
DOI
10.2196/12075
Chinese Library Classification
R19 [Health organization and services (health service management)]
Abstract
Background: Patient-reported outcome (PRO) measures are used to describe the natural history of disease, guide disease management, and measure the effects of interventions in clinical trials. Patients themselves increasingly use Web-based PRO tools to track their progress, share their data, and even self-experiment. However, existing PROs have limitations: they were designed for paper rather than screens, are long and burdensome, are negatively framed, carry onerous licensing restrictions, and tend to be either too generic or too specific.

Objective: This study aimed to develop and validate the core items of a modular, patient-centric PRO system (Thrive) that could measure health status across a range of chronic conditions with minimal burden.

Methods: Thrive was developed in 4 phases, largely consistent with Food and Drug Administration guidance on PRO development. First, preliminary core items (items common across multiple conditions: core Thrive items) were developed through literature review, analysis of approximately 20 existing PROs on PatientsLikeMe, and feedback from psychometric and content experts. Second, 2 rounds of cognitive interviews were conducted iteratively with patients (N=14) to obtain feedback on the preliminary items. Third, the core Thrive items were administered electronically, along with comparator measures including the 20-item Short-Form General Health Survey (SF-20) and the Patient Health Questionnaire-9 (PHQ-9), to a large sample (N=2002) of adults with chronic diseases through the PatientsLikeMe platform. On the basis of theoretical and empirical rationale, items were revised or removed. Fourth, the revised core Thrive items were administered to another sample of patients (N=704) alongside generic and condition-specific comparator measures. A psychometric evaluation, which included both modern and classical test theory approaches, was conducted on these items, and several more items were removed.

Results: Cognitive interviews helped remove confusing or redundant items. Empirical testing of the subscales revealed good internal consistency (Cronbach alpha=.712-.879), test-retest reliability (absolute intraclass correlations=.749-.912), and convergent validity with legacy PRO scales (eg, Pearson r=.5-.75 between Thrive subscales and the PHQ-9 total). The finalized instrument consists of a 19-item core comprising 5 multi-item subscales: Core Symptoms, Abilities, Mobility, Sleep, and Thriving. The results provide evidence of construct (content, convergent) validity, high test-retest and internal consistency reliability, and the ability to detect change over time. The items did not exhibit bias based on gender or age and generally functioned similarly across conditions. These results support the use of the core Thrive items across diverse chronic patient populations.

Conclusions: Thrive appears to be a useful approach for capturing domains that matter to patients with chronic conditions. This core set serves as a foundation for developing modular condition-specific versions in the near future. Cross-walking against traditional PROs from the PatientsLikeMe platform is underway, in addition to clinical validation and comparison with biomarkers. Thrive is licensed under Creative Commons Attribution-ShareAlike 4.0.
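The reliability and validity statistics reported above (internal consistency via Cronbach alpha, convergent validity via Pearson correlations) follow standard formulas. The sketch below is illustrative only: it uses synthetic responses rather than the study's data, and the cronbach_alpha helper is a hypothetical name introduced here, not part of the Thrive instrument or the PatientsLikeMe platform.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Internal consistency for an (n_respondents, n_items) response matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of summed scale scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)
# Hypothetical 5-item subscale answered by 200 respondents on a 0-4 response scale
latent = rng.normal(size=(200, 1))
responses = np.clip(np.round(latent + rng.normal(scale=0.8, size=(200, 5)) + 2), 0, 4)

alpha = cronbach_alpha(responses)

# Convergent validity: Pearson r between the subscale total and a comparator total
comparator = responses.sum(axis=1) + rng.normal(scale=2.0, size=200)
r = np.corrcoef(responses.sum(axis=1), comparator)[0, 1]

print(f"Cronbach alpha = {alpha:.3f}, Pearson r = {r:.3f}")
```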
Pages: 18