Instruments for evaluating education in evidence-based practice - A systematic review

Cited: 282
Authors
Shaneyfelt, Terrence
Baum, Karyn D.
Bell, Douglas
Feldstein, David
Houston, Thomas K.
Kaatz, Scott
Whelan, Chad
Green, Michael
Institutions
[1] Univ Alabama Birmingham, Sch Med, Vet Affairs Med Ctr, Dept Med, Birmingham, AL 35233 USA
[2] Dept Vet Affairs Med Ctr, Birmingham, AL USA
[3] Univ Minnesota, Sch Med, Dept Med, Minneapolis, MN 55455 USA
[4] Univ Calif Los Angeles, David Geffen Sch Med, Dept Med, Div Gen Internal Med, Los Angeles, CA USA
[5] Univ Wisconsin, Sch Med & Publ Hlth, Dept Med, Madison, WI USA
[6] Henry Ford Hosp, Detroit, MI 48202 USA
[7] Univ Chicago, Dept Med, Chicago, IL 60637 USA
[8] Yale Univ, Sch Med, Dept Med, New Haven, CT 06510 USA
Source
JAMA - Journal of the American Medical Association | 2006 / Vol. 296 / No. 09
DOI
10.1001/jama.296.9.1116
Chinese Library Classification
R5 [Internal Medicine];
Subject Classification Code
1002; 100201;
Abstract
Context: Evidence-based practice (EBP) is the integration of the best research evidence with patients' values and clinical circumstances in clinical decision making. Teaching of EBP should be evaluated and guided by evidence of its own effectiveness.
Objective: To appraise, summarize, and describe currently available EBP teaching evaluation instruments.
Data Sources and Study Selection: We searched the MEDLINE, EMBASE, CINAHL, HAPI, and ERIC databases; reference lists of retrieved articles; EBP Internet sites; and 8 education journals from 1980 through April 2006. For inclusion, studies had to report an instrument evaluating EBP, contain sufficient description to permit analysis, and present quantitative results of administering the instrument.
Data Extraction: Two raters independently abstracted information on the development, format, learner levels, evaluation domains, feasibility, reliability, and validity of the EBP evaluation instruments from each article. We defined 3 levels of instruments based on the type, extent, methods, and results of psychometric testing and suitability for different evaluation purposes.
Data Synthesis: Of 347 articles identified, 115 were included, representing 104 unique instruments. The instruments were most commonly administered to medical students and postgraduate trainees and evaluated EBP skills. Among EBP skills, acquiring evidence and appraising evidence were most commonly evaluated, but newer instruments evaluated asking answerable questions and applying evidence to individual patients. Most behavior instruments measured the performance of EBP steps in practice, but newer instruments documented the performance of evidence-based clinical maneuvers or patient-level outcomes. At least 1 type of validity evidence was demonstrated for 53% of instruments, but 3 or more types of validity evidence were established for only 10%. High-quality instruments were identified for evaluating the EBP competence of individual trainees, determining the effectiveness of EBP curricula, and assessing EBP behaviors with objective outcome measures.
Conclusions: Instruments with reasonable validity are available for evaluating some domains of EBP and may be targeted to different evaluation needs. Further development and testing are required to evaluate EBP attitudes, behaviors, and more recently articulated EBP skills.
Pages: 1116-1127
Page count: 12