The IDEA Assessment Tool: Assessing the Reporting, Diagnostic Reasoning, and Decision-Making Skills Demonstrated in Medical Students' Hospital Admission Notes

Cited by: 29
Authors
Baker, Elizabeth A. [1 ]
Ledford, Cynthia H. [2 ,3 ]
Fogg, Louis [4 ,5 ]
Way, David P. [6 ]
Park, Yoon Soo [7 ]
Affiliations
[1] Rush Univ, Dept Internal Med, Chicago, IL 60612 USA
[2] Ohio State Univ, Coll Med, Dept Internal Med, Columbus, OH 43210 USA
[3] Ohio State Univ, Coll Med, Dept Pediat, Columbus, OH 43210 USA
[4] Rush Univ, Dept Psychol, Chicago, IL 60612 USA
[5] Rush Univ, Coll Nursing, Chicago, IL 60612 USA
[6] Ohio State Univ, Coll Med, Off Evaluat Curricular Res & Dev, Columbus, OH 43210 USA
[7] Univ Illinois, Dept Med Educ, Chicago, IL USA
Keywords
medical student; clinical documentation review; assessment; clinical reasoning; case presentations; write-ups; performance; vocabulary; justification; feasibility; reliability; competence; clerkships; knowledge
DOI: 10.1080/10401334.2015.1011654
CLC number: G40 [Education]
Discipline codes: 040101; 120403
Abstract
Construct: Clinical skills are used in the care of patients, including reporting, diagnostic reasoning, and decision-making skills. Written comprehensive new patient admission notes (H&Ps) are a ubiquitous part of student education but are underutilized in the assessment of clinical skills. The interpretive summary, differential diagnosis, explanation of reasoning, and alternatives (IDEA) assessment tool was developed to assess students' clinical skills using written comprehensive new patient admission notes.

Background: The validity evidence for assessment of clinical skills using clinical documentation following authentic patient encounters has not been well documented. Diagnostic justification tools and postencounter notes are described in the literature [1, 2] but are based on standardized patient encounters. To our knowledge, the IDEA assessment tool is the first published tool that uses medical students' H&Ps to rate students' clinical skills.

Approach: The IDEA assessment tool is a 15-item instrument that asks evaluators to rate students' reporting, diagnostic reasoning, and decision-making skills based on medical students' new patient admission notes. This study presents validity evidence in support of the IDEA assessment tool using Messick's unified framework, including content (theoretical framework), response process (interrater reliability), internal structure (factor analysis and internal-consistency reliability), and relationship to other variables.

Results: Validity evidence is based on results from four studies conducted between 2010 and 2013. First, the factor analysis (2010, n = 216) yielded a three-factor solution, measuring patient story, IDEA, and completeness, with reliabilities of .79, .88, and .79, respectively. Second, an initial interrater reliability study (2010) involving two raters demonstrated fair to moderate consensus (kappa = .21-.56, rho = .42-.79). Third, a second interrater reliability study (2011) with 22 trained raters also demonstrated fair to moderate agreement (intraclass correlations [ICCs] = .29-.67). There was moderate reliability for all three skill domains, including reporting skills (ICC = .53), diagnostic reasoning skills (ICC = .64), and decision-making skills (ICC = .63). Fourth, there was a significant correlation between IDEA rating scores (2010-2013) and final Internal Medicine clerkship grades (r = .24, 95% confidence interval [CI] = [.15, .33]).

Conclusions: The IDEA assessment tool is a novel tool with validity evidence to support its use in the assessment of students' reporting, diagnostic reasoning, and decision-making skills. The moderate reliability achieved supports formative or lower-stakes summative uses rather than high-stakes summative judgments.
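The chance-corrected agreement statistic reported above can be illustrated with a minimal sketch. The function below computes Cohen's kappa for two raters in plain Python; the ratings are invented for demonstration and are not data from the study:

```python
from collections import Counter

def cohen_kappa(rater1, rater2):
    """Cohen's kappa: agreement between two raters, corrected for chance."""
    n = len(rater1)
    # Observed agreement: fraction of items both raters scored identically.
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Expected chance agreement: sum over categories of the product of
    # each rater's marginal proportions.
    c1, c2 = Counter(rater1), Counter(rater2)
    p_e = sum(c1[k] * c2[k] for k in c1) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical 1-3 ratings of eight admission notes by two raters.
r1 = [1, 2, 3, 1, 2, 3, 1, 2]
r2 = [1, 2, 3, 1, 2, 1, 1, 3]
print(round(cohen_kappa(r1, r2), 2))  # → 0.62
```

A kappa around .6 sits in the "moderate" band, matching how the ranges in the Results are characterized; the ICCs reported in the study address the analogous question for the multi-rater scale scores.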
Pages: 163-173 (11 pages)
References (24 total)
[1] Axelson RD. Assessment in Health Professions Education, 2009: 57.
[2] Baker EA. Journal of General Internal Medicine, 2005, 20: 157.
[3] Baker EA. Journal of General Internal Medicine, 2003, 18: 235.
[4] Baker EA, Connell KJ, Bordage G, Sinacore J. Can diagnostic semantic competence be assessed from the medical record? Academic Medicine, 1999, 74(10): S13-S15.
[5] Battistone MJ, Milne C, Sande MA, Pangaro LN, Hemmer PA, Shomaker TS. The feasibility and acceptability of implementing formal evaluation sessions and using descriptive vocabulary to assess student performance on a clinical clerkship. Teaching and Learning in Medicine, 2002, 14(1): 5-10.
[6] Bordage G, Lemieux M. Semantic structures and diagnostic thinking of experts and novices. Academic Medicine, 1991, 66(9): S70-S72.
[7] Bordage G, Connell KJ, Chang RW, Gecht MR, Sinacore JM. Assessing the semantic content of clinical case presentations: studies of reliability and concurrent validity. Academic Medicine, 1997, 72(10): S37-S39.
[8] Bordage G. Elaborated knowledge: a key to successful diagnostic thinking. Academic Medicine, 1994, 69(11): 883-885.
[9] Chang RW, Bordage G, Connell KJ. The importance of early problem representation during case presentations. Academic Medicine, 1998, 73(10): S109-S111.
[10] Cianciolo AT, Williams RG, Klamen DL, Roberts NK. Biomedical knowledge, clinical cognition and diagnostic justification: a structural equation model. Medical Education, 2013, 47(3): 309-316.