Knowledge structure enhanced graph representation learning model for attentive knowledge tracing

Cited by: 29
Authors
Gan, Wenbin [1 ]
Sun, Yuan [1 ]
Sun, Yi [2 ]
Affiliations
[1] Sokendai, Natl Inst Informat, Tokyo, Japan
[2] Univ Chinese Acad Sci, Sch Comp Sci & Technol, Beijing, Peoples R China
Funding
Japan Society for the Promotion of Science;
Keywords
cognitive question difficulty; graph representation learning; intelligent tutoring systems; knowledge structure discovery; knowledge tracing; learner proficiency estimation;
DOI
10.1002/int.22763
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Knowledge tracing (KT) is a fundamental personalized-tutoring technique for learners in online learning systems. Recent KT methods employ flexible deep neural network-based models that excel at this task. However, the adequacy of KT is still challenged by the sparseness of the learners' exercise data. To alleviate the sparseness problem, most of the existing KT studies are performed at the skill level rather than the question level, as questions are often numerous and associated with much fewer skills. However, at the skill level, KT neglects the distinctive information related to the questions themselves and their relations. In this case, the models can imprecisely infer the learners' knowledge states and might fail to capture the long-term dependencies in the exercising sequences. In the knowledge domain, skills are naturally linked as a graph (with the edges being the prerequisite relations between pedagogical concepts). We refer to such a graph as a knowledge structure (KS). Incorporating a KS into the KT procedure can potentially resolve both the sparseness and the information loss, but this avenue has been underexplored because obtaining the complete KS of a domain is challenging and labor-intensive. In this paper, we propose a novel KS-enhanced graph representation learning model for KT with an attention mechanism (KSGKT). We first explore eight methods that automatically infer the domain KS from learner response data and integrate it into the KT procedure. Leveraging a graph representation learning model, we then obtain the question and skill embeddings from the KS-enhanced graph. To incorporate more distinctive information on the questions, we extract the cognitive question difficulty from the learning history of each learner. We then propose a convolutional representation method that fuses these distinctive features, thus obtaining a comprehensive representation of each question.
These representations are input to the proposed KT model, and the long-term dependencies are handled by the attention mechanism. The model finally predicts the learner's performance on new problems. Extensive experiments conducted from six perspectives on three real-world data sets demonstrated the superiority and interpretability of our model for learner-performance modeling. Based on the KT results, we also suggest three potential applications of our model.
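The attention-based prediction step described in the abstract can be illustrated with a minimal sketch: attention weights over the embeddings of past interactions produce a knowledge-state summary, from which the correctness probability for a new question is predicted. All names, shapes, and parameters here are hypothetical illustrations, not the authors' actual KSGKT implementation.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over the last axis."""
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def attentive_kt_predict(history, query, w_out, b_out):
    """Predict the probability of answering a new question correctly.

    history: (t, d) embeddings of past (question, response) interactions
    query:   (d,)   embedding of the new question
    The attention weights over the history let distant interactions
    contribute, capturing long-term dependencies in the sequence.
    """
    d = query.shape[0]
    scores = history @ query / np.sqrt(d)   # scaled dot-product scores, (t,)
    weights = softmax(scores)               # attention over past steps
    state = weights @ history               # attended knowledge state, (d,)
    logit = state @ w_out + b_out
    return 1.0 / (1.0 + np.exp(-logit))     # sigmoid -> correctness probability

# Toy usage with random embeddings standing in for learned representations
rng = np.random.default_rng(0)
d = 8
hist = rng.normal(size=(5, d))              # five past interactions
q = rng.normal(size=d)                      # new question embedding
p = attentive_kt_predict(hist, q, rng.normal(size=d), 0.0)
print(0.0 < p < 1.0)  # prints True: output is a valid probability
```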
Pages: 2012-2045
Number of pages: 34
Related papers
50 records in total
  • [31] Graph representation learning via simple jumping knowledge networks
    Yang, Fei
    Zhang, Huyin
    Tao, Shiming
    Hao, Sheng
    APPLIED INTELLIGENCE, 2022, 52 (10) : 11324 - 11342
  • [33] Forgetting-aware Linear Bias for Attentive Knowledge Tracing
    Im, Yoonjin
    Choi, Eunseong
    Kook, Heejin
    Lee, Jongwuk
    PROCEEDINGS OF THE 32ND ACM INTERNATIONAL CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, CIKM 2023, 2023, : 3958 - 3962
  • [34] GKT-CD: Make Cognitive Diagnosis Model Enhanced by Graph-based Knowledge Tracing
    Zhang, Junrui
    Mo, Yun
    Chen, Changzhi
    He, Xiaofeng
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021,
  • [35] MAN: Memory-augmented Attentive Networks for Deep Learning-based Knowledge Tracing
    He, Liangliang
    Li, Xiao
    Wang, Pancheng
    Tang, Jintao
    Wang, Ting
    ACM TRANSACTIONS ON INFORMATION SYSTEMS, 2024, 42 (01)
  • [36] Contrastive Learning for Knowledge Tracing
    Lee, Wonsung
    Chun, Jaeyoon
    Lee, Youngmin
    Park, Kyoungsoo
    Park, Sungrae
    PROCEEDINGS OF THE ACM WEB CONFERENCE 2022 (WWW'22), 2022, : 2330 - 2338
  • [37] Research on Deep Knowledge Tracing Model Integrating Graph Attention Network
    Zhao, Zhongyuan
    Liu, Zhaohui
    Wang, Bei
    Ouyang, Lijun
    Wang, Can
    Ouyang, Yan
    2022 PROGNOSTICS AND HEALTH MANAGEMENT CONFERENCE, PHM-LONDON 2022, 2022, : 389 - 394
  • [38] Fusing hybrid attentive network with self-supervised dual-channel heterogeneous graph for knowledge tracing
    Wu, Tangjie
    Ling, Qiang
    EXPERT SYSTEMS WITH APPLICATIONS, 2023, 225
  • [39] Deep Knowledge Tracing Based on Spatial and Temporal Representation Learning for Learning Performance Prediction
    Lyu, Liting
    Wang, Zhifeng
    Yun, Haihong
    Yang, Zexue
    Li, Ya
    APPLIED SCIENCES-BASEL, 2022, 12 (14):
  • [40] A survey on deep learning based knowledge tracing
    Song, Xiangyu
    Li, Jianxin
    Cai, Taotao
    Yang, Shuiqiao
    Yang, Tingting
    Liu, Chengfei
    KNOWLEDGE-BASED SYSTEMS, 2022, 258