Diversified Concept Attention Method for Knowledge Tracing

Cited: 0
Authors
Wu, Hao [1 ]
Cai, Yuekang [1 ]
Affiliations
[1] Zhejiang Gongshang Univ, Comp & Informat Engn, Hangzhou 310018, Peoples R China
Source
COMPUTER SUPPORTED COOPERATIVE WORK AND SOCIAL COMPUTING, CHINESECSCW 2021, PT II | 2022 / Vol. 1492
Keywords
Personalized tutoring; Knowledge tracing; Diversified concept; Attention layer of emotional consciousness; Attention mechanism;
DOI
10.1007/978-981-19-4549-6_32
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Knowledge tracing predicts a learner's future performance by modeling how the learner's knowledge state evolves over time from historical learning records; it is an important problem in personalized tutoring. Although there is much related work on knowledge tracing, existing methods still have shortcomings: tracing is often limited to a single historical exercise, the concept being tested is restricted to a single knowledge concept, and the semantic information shared across multiple concepts is rarely explored. To address these issues, this paper proposes a Diversified Concept Attention model for Knowledge Tracing (DCAKT). The method compensates for single-concept tracing by applying natural language processing techniques to extract multiple concepts over time through an emotion-aware attention layer, and then uses an attention mechanism to evaluate the correlation between the exercise to be tested and historical concepts. The performance of DCAKT is assessed on real datasets from the ASSISTments intelligent tutoring platform and from a college engineering course on statics. In addition, the method can independently learn meaningful exercise sequences containing the correct concepts.
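The attention step named in the abstract, scoring the correlation between the exercise to be tested and historical concept embeddings, can be illustrated with a generic scaled dot-product attention sketch. This is not the authors' exact DCAKT formulation; the function name, dimensions, and toy data below are all illustrative assumptions:

```python
import numpy as np

def attention_readout(query, concept_history):
    """Score a target exercise embedding against historical concept
    embeddings and return an attention-weighted knowledge summary.

    query           : (d,)   embedding of the exercise to be tested
    concept_history : (t, d) embeddings of t previously seen concepts
    """
    d = query.shape[0]
    # Correlation between the target exercise and each past concept,
    # scaled by sqrt(d) as in standard dot-product attention.
    scores = concept_history @ query / np.sqrt(d)          # shape (t,)
    # Softmax (numerically stabilized) turns scores into weights.
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # Weighted combination of historical concepts: the knowledge
    # state most relevant to the target exercise.
    return weights @ concept_history                       # shape (d,)

# Toy usage: 3 historical concepts with 4-dimensional embeddings.
history = np.array([[1., 0., 0., 0.],
                    [0., 1., 0., 0.],
                    [1., 1., 0., 0.]])
q = np.array([1., 0., 0., 0.])
summary = attention_readout(q, history)
```

Concepts that align with the target exercise receive larger weights, so the summary leans toward dimensions the query shares with the history.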
Pages: 418-430 (13 pages)