Deep Knowledge Tracing Model with an Evolved Transformer Structure

Cited by: 0
Authors
Li, Zhijun [1 ]
Xue, Zixiao [1 ]
Liu, Chen [1 ]
Feng, Yanzhang [1 ]
Affiliations
[1] North China Univ Technol, Sch Elect & Control Engn, Beijing 100144, Peoples R China
Keywords
Deep knowledge tracing; Transformer; Hybrid attention mechanism; Interpretability
DOI
10.1109/DDCLS58216.2023.10167354
CLC Classification
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Deep-learning-based Knowledge Tracing (DKT) has become a research hotspot in the intelligent education field. Compared with conventional methods, DKT offers better predictive performance, but it also has some problems, such as poor interpretability and difficulty in reflecting the causal association between the learning process and test results. In this paper, a new DKT model is proposed based on an evolved Transformer structure (DKT-ETS). The encoder layer is composed of three coding networks with a multi-head self-attention mechanism; its inputs are three types of pre-processed data: process characteristic data, test label data, and answer-result data, and its outputs are the three matrices V, Q, and K. The decoder layer also uses the attention mechanism: its input is the three matrices produced by the encoder, and its output is the predicted result. Through this structural improvement, the new model introduces a degree of interpretability into the V, Q, and K matrices of the attention mechanism, so the causal relationship between the learning process and test results can be reflected to a certain extent: the V matrix represents the characteristic information of the testee's learning process; the Q matrix reflects the knowledge-point information examined by the current test item; and the K matrix represents the results of previous tests. DKT-ETS was validated on the large-scale knowledge tracing dataset EdNet, and the results show that its ACC and AUC evaluation metrics are significantly improved.
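The abstract's mapping of the three encoder outputs onto the attention inputs can be illustrated with a short sketch. The snippet below is a minimal PyTorch illustration, not the authors' implementation: the module names, embedding sizes, layer counts, and the absence of masking are all assumptions, since the abstract does not specify them. It encodes each of the three pre-processed input streams with its own multi-head self-attention coding network and feeds the results into a decoder-side attention as Q (test labels), K (answer results), and V (process characteristics).

```python
import torch
import torch.nn as nn

def make_encoder(d_model: int, n_heads: int, n_layers: int) -> nn.TransformerEncoder:
    """One self-attention coding network; hyperparameters are placeholders."""
    layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
    return nn.TransformerEncoder(layer, num_layers=n_layers)

class DKTETSSketch(nn.Module):
    """Hypothetical sketch of a DKT-ETS-style model: three encoders yield
    V (process characteristics), Q (test labels), and K (answer results),
    and a decoder-side attention combines them into a correctness prediction."""

    def __init__(self, d_model: int = 128, n_heads: int = 8, n_layers: int = 2):
        super().__init__()
        self.enc_process = make_encoder(d_model, n_heads, n_layers)  # -> V
        self.enc_labels = make_encoder(d_model, n_heads, n_layers)   # -> Q
        self.enc_answers = make_encoder(d_model, n_heads, n_layers)  # -> K
        self.decoder_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.head = nn.Linear(d_model, 1)  # per-step probability of a correct answer

    def forward(self, process: torch.Tensor, labels: torch.Tensor,
                answers: torch.Tensor) -> torch.Tensor:
        v = self.enc_process(process)   # learning-process characteristics
        q = self.enc_labels(labels)     # knowledge points of current test items
        k = self.enc_answers(answers)   # results of previous tests
        context, _ = self.decoder_attn(query=q, key=k, value=v)
        return torch.sigmoid(self.head(context)).squeeze(-1)

# Shape check with random stand-ins for the three pre-processed streams.
model = DKTETSSketch()
x = lambda: torch.randn(4, 50, 128)   # (batch, sequence, d_model)
print(model(x(), x(), x()).shape)     # torch.Size([4, 50])
```

In a real knowledge tracing setup, a causal mask over past interactions and task-specific input embeddings would also be needed; the random tensors here merely exercise the shapes.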
Pages: 1586-1592
Number of pages: 7