Assessment validity and learning analytics as prerequisites for ensuring student-centred learning design

Cited by: 20
Authors
Divjak, Blazenka [1 ]
Svetec, Barbi [1 ]
Horvat, Damir [1 ]
Kadoic, Nikola [1 ]
Affiliations
[1] Univ Zagreb, Fac Org & Informat, Pavlinska 2, Varazhdin, Croatia
Keywords
assessment; assessment utility; assessment validity; LD; learning analytics; learning outcomes; multi-criteria decision-making; PROFESSIONAL COMPETENCE; ASSESSMENT LITERACY; HIGHER-EDUCATION; STRATEGIES; FRAMEWORK; QUALITY;
DOI
10.1111/bjet.13290
CLC number
G40 [Education]
Discipline codes
040101; 120403
Abstract
To ensure the validity of an assessment programme, it is essential to align it with the intended learning outcomes (LOs). We present a model for ensuring assessment validity which supports this constructive alignment and uses learning analytics (LA). The model is based on LA that include a comparison between ideal LO weights (expressing the prioritization of LOs), actual assessment weights (maximum assessment points per LO) and student assessment results (assessment points actually obtained per LO), as well as clustering and trace data analysis. These analytics are part of a continuous improvement cycle that includes strategic planning and learning design (LD) supported by LO prioritization, and monitoring and evaluation supported by LA. To illustrate and test the model, we conducted a study of a graduate-level higher education course in applied mathematics, analysing student assessment results and activity in a learning management system. The study showed that the analyses provided valuable insights with practical implications for the development of sound LD, tailored educational interventions, databases of assessment tasks, recommendation systems and self-regulated learning. Future research should investigate the possibilities for automating such LA, to enable full exploitation of their potential and their use in everyday teaching and learning.

Practitioner notes

What is already known about this topic
- To develop sound, student-centred learning design (LD), it is essential to ensure that assessment is constructively aligned with the intended learning outcomes (LOs).
- This constructive alignment is crucial for ensuring the validity of an assessment programme.
- Learning analytics (LA) can provide insights that help develop valid assessment programmes.

What this paper adds
- As not all LOs are equally important, assessment programmes should reflect the prioritization of LOs, which can be determined using various multi-criteria decision-making (MCDM) methods.
- Based on an empirical case, this article presents and illustrates a model for the continuous improvement of LD, which uses LA to compare how LOs are reflected in (actual) student results, in the (actual) assessment programme and in the (ideal) MCDM-based prioritization of LOs.
- The study shows how clustering students on the basis of their assessment results can be used in LA to provide insights for educational interventions better targeted to students' needs.

Implications for practice and/or policy
- The proposed LA can provide important insights for the development (or improvement) of LD in line with the intended course LOs, and also with study programme LOs (if course and study programme LOs are properly aligned).
- The LA can also contribute to the development of databases of assessment tasks aligned with course LOs and with ensured validity, supporting sharing and reuse, as well as to the development of tailored educational interventions (eg, based on clustering).
- The proposed LA can further contribute to the development of recommendation systems, offering recommendations for the improvement of LD to teachers or learning suggestions to students, and supporting students' metacognition and self-regulated learning.
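As an illustration of the analytics described above, the sketch below (in Python; not the authors' implementation, and with invented LO labels, weights, point allocations and student scores) compares normalised weight vectors for the MCDM-derived ideal LO priorities, the LO weights implied by the maximum assessment points, and the LO weights implied by students' obtained points, and then clusters students by their per-LO achievement rates.

# Minimal sketch of the weight comparison and clustering described in the abstract.
# All values are assumed for illustration only.
import numpy as np
from sklearn.cluster import KMeans

los = ["LO1", "LO2", "LO3", "LO4"]

# Ideal LO weights, e.g. derived with an MCDM method (assumed values).
ideal_weights = np.array([0.35, 0.30, 0.20, 0.15])

# Maximum assessment points allocated to each LO in the actual assessment programme (assumed).
max_points_per_lo = np.array([40.0, 25.0, 20.0, 15.0])

# Points each student actually obtained per LO (rows = students; assumed data).
student_points = np.array([
    [35, 20, 10, 12],
    [18, 22, 15,  5],
    [40, 10,  8, 14],
    [25, 24, 18, 10],
])

def normalise(v):
    """Scale a non-negative vector so its components sum to 1."""
    return v / v.sum()

actual_weights = normalise(max_points_per_lo)           # weights implied by the assessment programme
result_weights = normalise(student_points.sum(axis=0))  # weights implied by student results

# Per-LO discrepancies signal misalignment between planned priorities and practice.
for lo, ideal, actual, result in zip(los, ideal_weights, actual_weights, result_weights):
    print(f"{lo}: ideal={ideal:.2f} actual={actual:.2f} results={result:.2f}")

# Cluster students by their per-LO achievement rate to help target interventions.
achievement = student_points / max_points_per_lo        # fraction of available points per LO
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(achievement)
print("Cluster per student:", labels)

In the model, large per-LO gaps between the ideal weights and the weights implied by the assessment programme or by student results would flag parts of the LD to revisit, while the clusters indicate groups of students who may benefit from tailored interventions.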
Pages: 313-334
Number of pages: 22