Standing on the shoulders of giants: Online formative assessments as the foundation for predictive learning analytics models

Cited by: 29
Authors
Bulut, Okan [1 ]
Gorgun, Guher [2 ]
Yildirim-Erbasli, Seyma N. [2 ]
Wongvorachan, Tarid [2 ]
Daniels, Lia M. [1 ]
Gao, Yizhu [2 ]
Lai, Ka Wing [2 ]
Shin, Jinnie [3 ]
Affiliations
[1] Univ Alberta, Ctr Res Appl Measurement & Evaluat, Edmonton, AB, Canada
[2] Univ Alberta, Dept Educ Psychol, Edmonton, AB, Canada
[3] Univ Florida, Coll Educ, Gainesville, FL, USA
Keywords
formative assessment; learning analytics; learning management system; log data; predictive modelling
Additional indexed terms
students; design; participation; achievement; performance; engagement; framework; patterns; behavior
DOI
10.1111/bjet.13276
Chinese Library Classification (CLC) number
G40 [Education]
Subject classification codes
040101; 120403
Abstract
As universities around the world have begun to use learning management systems (LMSs), more learning data have become available to gain deeper insights into students' learning processes and make data-driven decisions to improve student learning. With the availability of rich data extracted from the LMS, researchers have turned much of their attention to learning analytics (LA) applications using educational data mining techniques. Numerous LA models have been proposed to predict student achievement in university courses. To design predictive LA models, researchers often follow a data-driven approach that prioritizes prediction accuracy while sacrificing theoretical links to learning theory and its pedagogical implications. In this study, we argue that instead of complex variables (e.g., event logs, clickstream data, timestamps of learning activities), data extracted from online formative assessments should be the starting point for building predictive LA models. Using the LMS data from multiple offerings of an asynchronous undergraduate course, we analysed the utility of online formative assessments in predicting students' final course performance. Our findings showed that the features extracted from online formative assessments (e.g., completion, timestamps and scores) served as strong and significant predictors of students' final course performance. Scores from online formative assessments were consistently the strongest predictor of student performance across the three sections of the course. The number of clicks in the LMS and the time difference between first access and due dates of formative assessments were also significant predictors. Overall, our findings emphasize the need for online formative assessments to build predictive LA models informed by theory and learning design.

Practitioner notes

What is already known about this topic
- Higher education institutions often use learning analytics for the early identification of low-performing students or students at risk of dropping out.
- Most predictive models in learning analytics rely on immutable student characteristics (e.g., gender, race and socioeconomic status) and complex variables extracted from log data within a learning management system.
- Prioritizing prediction accuracy without theory orientation often yields "black-box" models that fail to inform educators on what remedies need to be taken to improve student learning.

What this paper adds
- Predictive models in learning analytics should consider learning theory, pedagogy and learning design to identify key predictors of student learning.
- Online formative assessments can be a starting point for building predictive models that are not only accurate but also provide educators with actionable insights on how student learning can be improved.
- Time-related and score-related features extracted from online formative assessments are particularly useful for predicting students' course performance.

Implications for practice and/or policy
- This study provides strong evidence for using online formative assessments as the foundation for predictive models in learning analytics.
- Student data from online formative assessments can help educators provide students with feedback while informing future formative assessment cycles.
- Higher education institutions should avoid the hype around complex data from learning management systems and instead rely on effective learning tools such as online formative assessments to revolutionize the use of learning analytics.
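To make the modelling approach described in the abstract concrete, the sketch below shows how features of the kind the study reports (formative assessment scores and completion, the gap between first access and the due date, and LMS click counts) could feed a simple regression model of final course performance. This is an illustrative example only, not the authors' code: the feature names, the toy data and the use of scikit-learn's LinearRegression are assumptions made for demonstration.

# Illustrative sketch only (not from the paper): predicting final course
# performance from hypothetical formative-assessment features extracted
# from LMS log data. All variable names and values are made up.
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

# Hypothetical per-student feature table
data = pd.DataFrame({
    "quiz_score_avg":    [78, 55, 92, 64, 88, 47, 71, 83],               # mean formative quiz score (%)
    "quizzes_completed": [5, 3, 5, 4, 5, 2, 4, 5],                       # number of formative quizzes completed
    "days_before_due":   [3.0, 0.5, 4.2, 1.1, 2.8, 0.2, 1.9, 3.5],       # first access relative to due date (days)
    "lms_clicks":        [420, 150, 510, 230, 460, 90, 300, 480],        # total clicks logged in the LMS
    "final_grade":       [81, 58, 95, 66, 90, 49, 73, 86],               # outcome: final course performance (%)
})

X = data.drop(columns="final_grade")
y = data["final_grade"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# Fit a simple linear model and check predictive accuracy on held-out students
model = LinearRegression().fit(X_train, y_train)
print("R^2 on held-out students:", round(r2_score(y_test, model.predict(X_test)), 3))

# Inspect which formative-assessment features carry the most predictive weight
print(dict(zip(X.columns, model.coef_.round(2))))

In a setup like this, inspecting the fitted coefficients parallels the interpretive step the paper emphasizes: asking which formative-assessment features, rather than which opaque clickstream variables, carry the predictive weight.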
Pages: 19-39
Number of pages: 21