Integrating fine-grained attention into multi-task learning for knowledge tracing

Cited by: 0
Authors
Liangliang He
Xiao Li
Pancheng Wang
Jintao Tang
Ting Wang
Affiliations
[1] National University of Defense Technology, College of Computer
[2] National University of Defense Technology, Information Center
Source
World Wide Web | 2023 / Volume 26
Keywords
Knowledge tracing; Deep learning; Representation learning; Learner modeling
DOI
Not available
Abstract
Knowledge Tracing (KT) refers to the task of modeling learners’ various knowledge states given their past performance on e-learning platforms. Existing KT models usually leverage only the response (correct or incorrect) feedback that learners generate while making exercises, which makes them imprecise at capturing the knowledge growth after each exercise attempt. Some researchers try to jointly learn hint-taking and response prediction with multi-task learning, but achieve only limited improvement due to the imprecision of the related task’s feedback and the rigid fusion of multi-task features. This paper proposes Multi-task Attentive Knowledge Tracing (MAKT), which jointly learns hint-taking and attempt-making predictions alongside response prediction. Two specific models are proposed within MAKT: a Bi-task Attentive Knowledge Tracing model (BAKT) and a Tri-task Attentive Knowledge Tracing model (TAKT). BAKT jointly learns a single related task with response prediction using two fine-grained attention mechanisms: an imbalance-aware attention mechanism and a skill-aware attention mechanism. The former is designed to address the inherent problem of imbalanced exercise samples in KT. The latter realizes skill individualization in both the multi-task feature fusion stage and the multi-model feature fusion stage. TAKT jointly learns two related tasks simultaneously with response prediction based on the skill-aware attention mechanism, and has the potential to be extended by integrating more related tasks. Experiments on several real-world benchmark datasets show that MAKT outperforms state-of-the-art KT methods at predicting future learner responses, indicating a bright outlook for combining KT with multi-task learning.
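
As a rough illustration of the joint-prediction setup described in the abstract, the following is a minimal sketch of a bi-task attentive KT model in PyTorch. It is not the authors' implementation: the class name BiTaskAttentiveKT, the DKT-style shared LSTM encoder, and the skill-conditioned attention that fuses the response and hint-taking features are all assumptions, intended only to show how two task heads can share an encoder while a skill-aware attention weights their features before response prediction.

# Hypothetical sketch of a bi-task attentive KT model (not the authors' code).
# A shared LSTM encodes past (skill, response) interactions; two task-specific
# projections produce response and hint-taking features; a skill-conditioned
# attention fuses them before the response prediction, loosely mirroring the
# "skill-aware" multi-task feature fusion described in the abstract.
import torch
import torch.nn as nn

class BiTaskAttentiveKT(nn.Module):
    def __init__(self, num_skills, emb_dim=64, hidden_dim=128):
        super().__init__()
        # Interaction ids encode (skill, correctness) pairs: 2 * num_skills ids
        self.interaction_emb = nn.Embedding(2 * num_skills, emb_dim)
        self.skill_emb = nn.Embedding(num_skills, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        # Task-specific feature projections
        self.response_feat = nn.Linear(hidden_dim, hidden_dim)
        self.hint_feat = nn.Linear(hidden_dim, hidden_dim)
        # Skill-conditioned attention query over the two task features
        self.attn_query = nn.Linear(emb_dim, hidden_dim)
        # Output heads for the two tasks
        self.response_head = nn.Linear(hidden_dim, 1)
        self.hint_head = nn.Linear(hidden_dim, 1)

    def forward(self, interactions, next_skills):
        # interactions: (B, T) interaction ids; next_skills: (B, T) skill ids
        h, _ = self.encoder(self.interaction_emb(interactions))        # (B, T, H)
        f_resp = torch.tanh(self.response_feat(h))                     # (B, T, H)
        f_hint = torch.tanh(self.hint_feat(h))                         # (B, T, H)
        # Stack the task features and attend with the next skill as query
        feats = torch.stack([f_resp, f_hint], dim=2)                   # (B, T, 2, H)
        q = self.attn_query(self.skill_emb(next_skills)).unsqueeze(2)  # (B, T, 1, H)
        scores = torch.softmax((q * feats).sum(-1, keepdim=True), dim=2)
        fused = (scores * feats).sum(dim=2)                            # (B, T, H)
        return (torch.sigmoid(self.response_head(fused)).squeeze(-1),
                torch.sigmoid(self.hint_head(f_hint)).squeeze(-1))

# Joint training would weight the two binary cross-entropy losses, e.g.
# loss = bce(resp_pred, resp_label) + lambda_hint * bce(hint_pred, hint_label).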
Pages: 3347 - 3372
Page count: 25
Related papers (50 records in total)
  • [1] Integrating fine-grained attention into multi-task learning for knowledge tracing
    He, Liangliang
    Li, Xiao
    Wang, Pancheng
    Tang, Jintao
    Wang, Ting
    WORLD WIDE WEB-INTERNET AND WEB INFORMATION SYSTEMS, 2023, 26 (05): : 3347 - 3372
  • [2] Response speed enhanced fine-grained knowledge tracing: A multi-task learning perspective
    Huang, Tao
    Hu, Shengze
    Yang, Huali
    Geng, Jing
    Li, Zhifei
    Xu, Zhuoran
    Ou, Xinjia
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 238
  • [3] ATTENTION-BASED MULTI-TASK LEARNING FOR FINE-GRAINED IMAGE CLASSIFICATION
    Liu, Dichao
    Wang, Yu
    Mase, Kenji
    Kato, Jien
    2021 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2021, : 1499 - 1503
  • [4] ADAPTIVE MULTI-TASK LEARNING FOR FINE-GRAINED CATEGORIZATION
    Sun, Gang
    Chen, Yanyun
    Liu, Xuehui
    Wu, Enhua
    2015 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2015, : 996 - 1000
  • [5] Fine-grained relation extraction with focal multi-task learning
    Zhang, Xinsong
    Liu, Tianyi
    Jia, Weijia
    Li, Pengshuai
    SCIENCE CHINA-INFORMATION SCIENCES, 2020, 63 (06)
  • [6] Fine-grained relation extraction with focal multi-task learning
    Xinsong Zhang
    Tianyi Liu
    Weijia Jia
    Pengshuai Li
    Science China Information Sciences, 2020, 63
  • [7] Fine-grained relation extraction with focal multi-task learning
    Xinsong ZHANG
    Tianyi LIU
    Weijia JIA
    Pengshuai LI
    Science China (Information Sciences), 2020, 63 (06): 229 - 231
  • [8] Multi-Task Multi-Head Attention Memory Network for Fine-Grained Sentiment Analysis
    Dai, Zehui
    Dai, Wei
    Liu, Zhenhua
    Rao, Fengyun
    Chen, Huajie
    Zhang, Guangpeng
    Ding, Yadong
    Liu, Jiyang
    NATURAL LANGUAGE PROCESSING AND CHINESE COMPUTING (NLPCC 2019), PT I, 2019, 11838 : 609 - 620
  • [9] Attribute hierarchy based multi-task learning for fine-grained image classification
    Zhao, Junjie
    Peng, Yuxin
    He, Xiangteng
    NEUROCOMPUTING, 2020, 395 : 150 - 159
  • [10] A Viewpoint Aware Multi-Task Learning Framework for Fine-Grained Vehicle Recognition
    Chen, Qianqiu
    Liu, Wei
    Yu, Xiaoxia
    IEEE ACCESS, 2020, 8 : 171912 - 171923