Knowledge structure enhanced graph representation learning model for attentive knowledge tracing

Cited by: 29
Authors
Gan, Wenbin [1 ]
Sun, Yuan [1 ]
Sun, Yi [2 ]
Affiliations
[1] Sokendai, Natl Inst Informat, Tokyo, Japan
[2] Univ Chinese Acad Sci, Sch Comp Sci & Technol, Beijing, Peoples R China
Fund
Japan Society for the Promotion of Science (JSPS);
Keywords
cognitive question difficulty; graph representation learning; intelligent tutoring systems; knowledge structure discovery; knowledge tracing; learner proficiency estimation;
DOI
10.1002/int.22763
CLC classification number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Knowledge tracing (KT) is a fundamental personalized-tutoring technique for learners in online learning systems. Recent KT methods employ flexible deep neural network-based models that excel at this task. However, the adequacy of KT is still challenged by the sparseness of the learners' exercise data. To alleviate the sparseness problem, most of the existing KT studies are performed at the skill level rather than the question level, as questions are often numerous and associated with much fewer skills. However, at the skill level, KT neglects the distinctive information related to the questions themselves and their relations. In this case, the models can imprecisely infer the learners' knowledge states and might fail to capture the long-term dependencies in the exercising sequences. In the knowledge domain, skills are naturally linked as a graph (with the edges being the prerequisite relations between pedagogical concepts). We refer to such a graph as a knowledge structure (KS). Incorporating a KS into the KT procedure can potentially resolve both the sparseness and the information loss, but this avenue has been underexplored because obtaining the complete KS of a domain is challenging and labor-intensive. In this paper, we propose a novel KS-enhanced graph representation learning model for KT with an attention mechanism (KSGKT). We first explore eight methods that automatically infer the domain KS from learner response data and integrate it into the KT procedure. Leveraging a graph representation learning model, we then obtain the question and skill embeddings from the KS-enhanced graph. To incorporate more distinctive information on the questions, we extract the cognitive question difficulty from the learning history of each learner. We then propose a convolutional representation method that fuses these distinctive features, thus obtaining a comprehensive representation of each question.
These representations are input to the proposed KT model, and the long-term dependencies are handled by the attention mechanism. The model finally predicts the learner's performance on new problems. Extensive experiments conducted from six perspectives on three real-world data sets demonstrated the superiority and interpretability of our model for learner-performance modeling. Based on the KT results, we also suggest three potential applications of our model.
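The abstract describes scoring a learner's response to a new question by attending over the embeddings of past interactions. A minimal illustrative sketch of that prediction step is shown below, assuming scaled dot-product attention and randomly generated embeddings; this is a generic attentive-KT sketch, not the authors' KSGKT implementation, and the embedding dimension `d` and the helper `attentive_kt_predict` are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    # Numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def attentive_kt_predict(past_interactions, target_question, d=8):
    """Predict the probability of a correct answer by attending
    over past interaction embeddings (illustrative sketch only)."""
    # past_interactions: (n, d) embeddings of earlier (question, response) pairs
    # target_question:  (d,)  embedding of the question being predicted
    scores = past_interactions @ target_question / np.sqrt(d)
    weights = softmax(scores)                # attention over the history
    context = weights @ past_interactions    # attention-weighted knowledge state
    logit = context @ target_question        # affinity of state and question
    return 1.0 / (1.0 + np.exp(-logit))      # sigmoid -> correctness probability

history = rng.normal(size=(5, 8))   # five past interactions, d = 8
question = rng.normal(size=8)       # the new question's embedding
p = attentive_kt_predict(history, question)
```

Because the attention weights are computed per target question, distant but relevant interactions in the sequence can still dominate the prediction, which is how attention mitigates the long-term dependency problem the abstract mentions.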
Pages: 2012 - 2045
Number of pages: 34
Related Papers
50 records in total
  • [11] A Deep Memory-Aware Attentive Model for Knowledge Tracing
    Shi, Juntai
    Su, Wei
    Liu, Lei
    Xu, Shenglin
    Huang, Tianyuan
    Liu, Jiamin
    Yue, Wenli
    Li, Shihua
    2023 23RD IEEE INTERNATIONAL CONFERENCE ON DATA MINING WORKSHOPS, ICDMW 2023, 2023, : 1581 - 1590
  • [12] A self-attentive model for tracing knowledge and engagement in parallel
    Jiang, Hua
    Xiao, Bing
    Luo, Yintao
    Ma, Junliang
    PATTERN RECOGNITION LETTERS, 2023, 165 : 25 - 32
  • [13] Psychological factors enhanced heterogeneous learning interactive graph knowledge tracing for understanding the learning process
    Wang, Zhifeng
    Wu, Wanxuan
    Zeng, Chunyan
    Luo, Heng
    Sun, Jianwen
    FRONTIERS IN PSYCHOLOGY, 2024, 15
  • [14] Enhanced Learning Behaviors and Ability Knowledge Tracing
    Ma, Fanglan
    Zhu, Changsheng
    Lei, Peng
    Yuan, Peiwen
    APPLIED SCIENCES-BASEL, 2025, 15 (02):
  • [15] Commonsense knowledge enhanced event graph representation learning for script event prediction
    Li, Xiang
    Jiang, Xinxi
    Zhou, Qifeng
    MACHINE LEARNING, 2025, 114 (03)
  • [16] INCREMENTAL CONTEXT AWARE ATTENTIVE KNOWLEDGE TRACING
    Wong, Cheryl Sze Yin
    Yang, Guo
    Chen, Nancy F.
    Savitha, Ramasamy
    2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2022, : 3993 - 3997
  • [17] ELAKT: Enhancing Locality for Attentive Knowledge Tracing
    Pu, Yanjun
    Liu, Fang
    Shi, Rongye
    Yuan, Haitao
    Chen, Ruibo
    Peng, Tianhao
    Wu, Wenjun
    ACM TRANSACTIONS ON INFORMATION SYSTEMS, 2024, 42 (04)
  • [18] Adversarial Bootstrapped Question Representation Learning for Knowledge Tracing
    Sun, Jianwen
    Yu, Fenghua
    Liu, Sannyuya
    Luo, Yawei
    Liang, Ruxia
    Shen, Xiaoxuan
    PROCEEDINGS OF THE 31ST ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, MM 2023, 2023, : 8016 - 8025
  • [19] Graph-Enhanced Multi-Activity Knowledge Tracing
    Zhao, Siqian
    Sahebi, Shaghayegh
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES: APPLIED DATA SCIENCE AND DEMO TRACK, ECML PKDD 2023, PT VI, 2023, 14174 : 529 - 546
  • [20] A Temporal-Enhanced Model for Knowledge Tracing
    Cui, Shaoguo
    Wang, Mingyang
    Xu, Song
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING-ICANN 2024, PT IX, 2024, 15024 : 407 - 421