MAN: Memory-augmented Attentive Networks for Deep Learning-based Knowledge Tracing

Cited by: 3
Authors
He, Liangliang [1 ]
Li, Xiao [2 ]
Wang, Pancheng [1 ]
Tang, Jintao [1 ]
Wang, Ting [1 ]
Affiliations
[1] Natl Univ Def Technol, Coll Comp, Yanwachi St, Changsha 410073, Peoples R China
[2] Natl Univ Def Technol, Informat Ctr, Yanwachi St, Changsha 410073, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
E-learning; knowledge tracing; deep learning; multi-head attention mechanism; memory-augmented neural network;
DOI
10.1145/3589340
CLC Classification Number
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
Knowledge Tracing (KT) is the task of modeling a learner's knowledge state, based on past performance, to predict future performance in e-learning systems. Deep learning-based methods, such as recurrent neural networks, memory-augmented neural networks, and attention-based neural networks, have recently been applied to KT and have demonstrated excellent performance in capturing the latent dependencies of a learner's knowledge state on recent exercises. However, these methods have limitations when dealing with the so-called Skill Switching Phenomenon (SSP): as learners respond to exercises in an e-learning system, the latent skills underlying the exercises typically switch irregularly. SSP degrades the ability of deep learning-based approaches to model the learner's knowledge state during skill switching, particularly when the association between the switched-to skills and previously learned skills is weak. To address this problem, we propose the Memory-augmented Attentive Network (MAN), which combines the advantages of memory-augmented neural networks and attention-based neural networks. Specifically, in MAN, memory-augmented neural networks model learners' longer-term knowledge, while attention-based neural networks model learners' recent knowledge. In addition, we design a context-aware attention mechanism that automatically weighs the tradeoff between these two types of knowledge. With extensive experiments on several e-learning datasets, we show that MAN effectively improves upon the predictive accuracy of existing state-of-the-art deep learning-based knowledge tracing (DLKT) methods.
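The architecture outlined in the abstract (a memory-augmented read for longer-term knowledge, attention over recent interactions for recent knowledge, and a context-aware gate blending the two) can be sketched as follows. This is a minimal, hypothetical PyTorch illustration assuming a DKVMN-style key-value memory and a sigmoid gate; all module and parameter names (MANSketch, n_mem_slots, and so on) are assumptions for illustration, not the authors' released implementation, and the memory-write step after each response is omitted for brevity.

# Hypothetical sketch of the MAN idea: long-term memory read + recent attention
# + context-aware gate. Illustrative only; not the paper's actual code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MANSketch(nn.Module):
    def __init__(self, n_skills, d_model=64, n_mem_slots=20, n_heads=4):
        super().__init__()
        # Interaction embedding over (skill, correctness) pairs -> 2 * n_skills ids.
        self.interaction_emb = nn.Embedding(2 * n_skills, d_model)
        self.skill_emb = nn.Embedding(n_skills, d_model)
        # Key-value memory for longer-term knowledge (DKVMN-style, read-only here).
        self.mem_key = nn.Parameter(torch.randn(n_mem_slots, d_model))
        self.mem_value = nn.Parameter(torch.randn(n_mem_slots, d_model))
        # Multi-head attention over recent interactions (recent knowledge).
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Context-aware gate weighing long-term vs. recent knowledge.
        self.gate = nn.Linear(3 * d_model, 1)
        self.predict = nn.Sequential(
            nn.Linear(2 * d_model, d_model), nn.ReLU(), nn.Linear(d_model, 1)
        )

    def forward(self, past_interactions, target_skills):
        # past_interactions: (B, T) interaction ids; target_skills: (B,) skill ids.
        x = self.interaction_emb(past_interactions)           # (B, T, d)
        q = self.skill_emb(target_skills)                     # (B, d)

        # Longer-term knowledge: address the value memory with the query skill.
        addr = F.softmax(q @ self.mem_key.t(), dim=-1)        # (B, slots)
        long_term = addr @ self.mem_value                     # (B, d)

        # Recent knowledge: attend from the query skill over recent interactions.
        recent, _ = self.attn(q.unsqueeze(1), x, x)           # (B, 1, d)
        recent = recent.squeeze(1)                            # (B, d)

        # Context-aware gate: blend the two knowledge representations.
        g = torch.sigmoid(self.gate(torch.cat([q, long_term, recent], dim=-1)))
        knowledge = g * long_term + (1 - g) * recent

        # Probability of a correct response on the target skill.
        logits = self.predict(torch.cat([knowledge, q], dim=-1))
        return torch.sigmoid(logits).squeeze(-1)

# Example usage with toy shapes (batch of 2 learners, 5 past interactions each):
# model = MANSketch(n_skills=100)
# p = model(torch.randint(0, 200, (2, 5)), torch.randint(0, 100, (2,)))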
Pages: 22