Evaluation Methods, Indicators, and Outcomes in Learning Health Systems: Protocol for a Jurisdictional Scan

Cited by: 0
Authors
Vanderhout, Shelley [1 ,2 ]
Bird, Marissa [1 ,2 ]
Giannarakos, Antonia [3 ]
Panesar, Balpreet [1 ,3 ]
Whitmore, Carly [4 ]
Affiliations
[1] Trillium Health Partners, Institute for Better Health, Mississauga, ON, Canada
[2] University of Toronto, Institute of Health Policy, Management and Evaluation, Toronto, ON, Canada
[3] Trillium Health Partners, Library & Knowledge Services, Mississauga, ON, Canada
[4] McMaster University, Faculty of Health Sciences, School of Nursing, Hamilton, ON, Canada
Source
JMIR RESEARCH PROTOCOLS | 2024 / Vol. 13
Funding
Canadian Institutes of Health Research;
Keywords
learning health systems; evaluation; jurisdictional scan; counterfactuals; LHS; health system; real-time evidence; informatics; organizational culture; learning cycles; benchmark; patient care; gaps; health care; inequities; development; implementation; intervention; new approach; CARE;
DOI
10.2196/57929
CLC Number
R19 [Health Care Organization and Administration (Health Services Management)];
Abstract
Background: In learning health systems (LHSs), real-time evidence, informatics, patient-provider partnerships and experiences, and organizational culture are combined to conduct "learning cycles" that support improvements in care. Although the concept of LHSs is fairly well established in the literature, evaluation methods, mechanisms, and indicators are less consistently described. Furthermore, LHSs often use "usual care" or "status quo" as a benchmark for comparing new approaches to care, but disentangling usual care from the multifarious care modalities found across settings is challenging. There is a need to identify which evaluation methods are used within LHSs, describe how LHS growth and maturity are conceptualized, and determine what tools and measures are being used to evaluate LHSs at the system level.

Objective: This study aimed to (1) identify international examples of LHSs and describe their evaluation approaches, frameworks, indicators, and outcomes; and (2) describe common characteristics, emphases, assumptions, or challenges in establishing counterfactuals in LHSs.

Methods: A jurisdictional scan, which is a method used to explore, understand, and assess how problems have been framed by others in a given field, will be conducted according to modified PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines. LHSs will be identified through a search of peer-reviewed and gray literature using Ovid MEDLINE, EBSCO CINAHL, Ovid Embase, Clarivate Web of Science, PubMed non-MEDLINE databases, and the web. We will describe evaluation approaches used at both the LHS learning cycle and system levels. To gain a comprehensive understanding of each LHS, including details specific to evaluation, self-identified LHSs will be included if they are described according to at least 4 of 11 prespecified criteria (core functionalities, analytics, use of evidence, co-design or implementation, evaluation, change management or governance structures, data sharing, knowledge sharing, training or capacity building, equity, and sustainability). Search results will be screened, extracted, and analyzed to inform a descriptive review pertaining to our main objectives. Evaluation methods and approaches, both within learning cycles and at the system level, as well as frameworks, indicators, and target outcomes, will be identified and summarized descriptively. Across evaluations, common challenges, assumptions, contextual factors, and mechanisms will be described.

Results: As of October 2024, the database searches described above yielded 3503 citations after duplicate removal. Full-text screening of 117 articles is complete, and 49 articles are under analysis. Results are expected in early 2025.

Conclusions: This research will characterize the current landscape of LHS evaluation approaches and provide a foundation for developing consistent and scalable metrics of LHS growth, maturity, and success. This work will also serve to identify opportunities for improving the alignment of current evaluation approaches and metrics with population health needs, community priorities, equity, and health system strategic aims.

Trial Registration: Open Science Framework b5u7e; https://osf.io/b5u7e

International Registered Report Identifier (IRRID): DERR1-10.2196/57929
Pages: 7