MAN: Memory-augmented Attentive Networks for Deep Learning-based Knowledge Tracing

Cited by: 3
Authors
He, Liangliang [1 ]
Li, Xiao [2 ]
Wang, Pancheng [1 ]
Tang, Jintao [1 ]
Wang, Ting [1 ]
Affiliations
[1] Natl Univ Def Technol, Coll Comp, Yanwachi St, Changsha 410073, Peoples R China
[2] Natl Univ Def Technol, Informat Ctr, Yanwachi St, Changsha 410073, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
E-learning; knowledge tracing; deep learning; multi-head attention mechanism; memory-augmented neural network;
DOI
10.1145/3589340
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812 ;
Abstract
Knowledge Tracing (KT) is the task of modeling a learner's knowledge state to predict future performance in e-learning systems based on past performance. Deep learning-based methods, such as recurrent neural networks, memory-augmented neural networks, and attention-based neural networks, have recently been applied to KT. Such methods have demonstrated excellent performance in capturing the latent dependencies of a learner's knowledge state on recent exercises. However, these methods have limitations when dealing with the so-called Skill Switching Phenomenon (SSP): as learners respond to exercises in an e-learning system, the latent skills in the exercises typically switch irregularly. SSP deteriorates the performance of deep learning-based approaches in modeling the learner's knowledge state during skill switching, particularly when the association between the switched-to skills and the previously learned skills is weak. To address this problem, we propose the Memory-augmented Attentive Network (MAN), which combines the advantages of memory-augmented neural networks and attention-based neural networks. Specifically, in MAN, memory-augmented neural networks are used to model learners' longer-term knowledge, while attention-based neural networks are used to model learners' recent knowledge. In addition, we design a context-aware attention mechanism that automatically weighs the tradeoff between these two types of knowledge. Through extensive experiments on several e-learning datasets, we show that MAN effectively improves the predictive accuracy of existing state-of-the-art deep learning-based knowledge tracing (DLKT) methods.
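The abstract's core idea, fusing a long-term knowledge read (from an external memory) with a recent knowledge read (from attention) via a context-aware weighting, can be illustrated as a learned gate. This is a minimal sketch under assumed shapes and a sigmoid gating form, not the paper's actual implementation; all names and sizes are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8                                    # hidden size (assumed)
W_g = rng.normal(scale=0.1, size=3 * d)  # gate parameters (assumed, normally learned)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def context_gate(long_term, recent, context):
    """Blend the two knowledge reads as g*long_term + (1-g)*recent,
    where the scalar gate g is conditioned on the current context
    (e.g., the embedding of the exercise being attempted)."""
    g = sigmoid(np.concatenate([long_term, recent, context]) @ W_g)
    return g * long_term + (1.0 - g) * recent, g

# One fused knowledge-state read for a single learner step.
state, g = context_gate(rng.normal(size=d), rng.normal(size=d), rng.normal(size=d))
```

Under SSP, a gate of this kind can lean on the long-term memory read when the current exercise's skill is weakly related to recent exercises, and on the attention read otherwise.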
Pages: 22
References
35 in total
[21]  
Piech C, 2015, Arxiv, DOI arXiv:1506.05908
[22]  
Pu YJ, 2018, IEEE INT CONF BIG DA, P3245, DOI 10.1109/BigData.2018.8622355
[23]  
Pu Yanjun, 2019, P INT C ED DAT MIN S
[24]  
Rollinson Joseph, 2015, P INT C ED DAT MIN S
[25]  
Santoro A, 2016, PR MACH LEARN RES, V48
[26]  
Shen SH, 2024, Arxiv, DOI arXiv:2105.15106
[27]   SAINT+: Integrating Temporal Features for EdNet Correctness Prediction [J].
Shin, Dongmin ;
Shim, Yugeun ;
Yu, Hangyeol ;
Lee, Seewoo ;
Kim, Byungsoo ;
Choi, Youngduck .
LAK21 CONFERENCE PROCEEDINGS: THE ELEVENTH INTERNATIONAL CONFERENCE ON LEARNING ANALYTICS & KNOWLEDGE, 2021, :490-496
[28]   JKT: A joint graph convolutional network based Deep Knowledge Tracing [J].
Song, Xiangyu ;
Li, Jianxin ;
Tang, Yifu ;
Zhao, Taige ;
Chen, Yunliang ;
Guan, Ziyu .
INFORMATION SCIENCES, 2021, 580 :510-523
[29]  
Sukhbaatar S, 2015, ADV NEUR IN, V28
[30]  
Tu ZP, 2016, Arxiv, DOI arXiv:1601.04811