The Sensitivity of a Scenario-Based Assessment of Written Argumentation to School Differences in Curriculum and Instruction

Cited: 0
Authors
Paul Deane
Joshua Wilson
Mo Zhang
Chen Li
Peter van Rijn
Hongwen Guo
Amanda Roth
Eowyn Winchester
Theresa Richter
Affiliations
[1] Educational Testing Service
[2] University of Delaware
[3] ETS Global
Source
International Journal of Artificial Intelligence in Education | 2021, Vol. 31
Keywords
Scenario-based assessment; SBA; Writing; Assessment; Automated writing evaluation; AWE; Natural language processing; NLP; Automated essay scoring; AES; Writing process; Keystroke log; Argumentation; Interim assessment; Formative assessment
DOI
Not available
Abstract
Educators need actionable information about student progress during the school year. This paper explores an approach to this problem in the writing domain that combines three measurement approaches intended for use in interim-assessment fashion: scenario-based assessments (SBAs), to simulate authentic classroom tasks; automated writing evaluation (AWE) features, to track changes in performance; and writing-process traits derived from keystroke logs. Our primary goal is to determine whether SBAs designed to measure English Language Arts skills, supplemented by richer measurement of the writing task, function well as interim assessments that are sensitive to differences in performance related to differences in quality of instruction. We calibrated these measures psychometrically using data from a prior study and then applied them to evaluate changes in performance in one suburban and two urban middle schools that taught argument writing. Of the three schools, only School A (the suburban school, with the strongest overall performance) showed significant score increases on an essay task, accompanied by distinctive patterns of improvement. A general, unconditioned growth pattern was also evident. These results demonstrate an approach that can provide richer, more actionable information about student status and about changes in student performance over the course of the school year.
Pages: 57-98 (41 pages)