EKT: Exercise-Aware Knowledge Tracing for Student Performance Prediction

Cited by: 261
Authors
Liu, Qi [1 ]
Huang, Zhenya [1 ]
Yin, Yu [1 ]
Chen, Enhong [1 ]
Xiong, Hui [2 ]
Su, Yu [3 ]
Hu, Guoping [3 ]
Affiliations
[1] Univ Sci & Technol China, Sch Comp Sci & Technol, Anhui Prov Key Lab Big Data Anal & Applicat, Hefei 230026, Anhui, Peoples R China
[2] Rutgers State Univ, Rutgers Business Sch, Management Sci & Informat Syst Dept, Newark, NJ 07102 USA
[3] IFLYTEK Co Ltd, iFLYTEK Res, Hefei 230088, Anhui, Peoples R China
Funding
National Natural Science Foundation of China; Science Foundation of the Ministry of Education of China;
Keywords
Intelligent education; knowledge tracing; exercise content; knowledge concept; MODEL;
DOI
10.1109/TKDE.2019.2924374
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
For offering proactive services (e.g., personalized exercise recommendation) to students in computer-supported intelligent education, one of the fundamental tasks is predicting student performance (e.g., scores) on future exercises, where it is necessary to track the change of each student's knowledge acquisition during her exercising activities. Unfortunately, to the best of our knowledge, existing approaches can only exploit the exercising records of students, and the problem of extracting the rich information contained in the materials of exercises (e.g., knowledge concepts, exercise content) to achieve both more precise prediction of student performance and more interpretable analysis of knowledge acquisition remains underexplored. To this end, in this paper, we present a holistic study of student performance prediction. To directly achieve the primary goal of performance prediction, we first propose a general Exercise-Enhanced Recurrent Neural Network (EERNN) framework by exploring both students' exercising records and the text content of the corresponding exercises. In EERNN, we simply summarize each student's state into an integrated vector and trace it with a recurrent neural network, where we design a bidirectional LSTM to learn the encoding of each exercise from its content. For making final predictions, we design two implementations on the basis of EERNN with different prediction strategies, i.e., EERNNM with a Markov property and EERNNA with an Attention mechanism. Then, to explicitly track a student's knowledge acquisition on multiple knowledge concepts, we extend EERNN to an explainable Exercise-aware Knowledge Tracing (EKT) framework by incorporating the knowledge concept information, where the student's integrated state vector is extended to a knowledge state matrix. In EKT, we further develop a memory network for quantifying how much each exercise can affect the students' mastery of multiple knowledge concepts during the exercising process.
Finally, we conduct extensive experiments and evaluate both the EERNN and EKT frameworks on a large-scale real-world dataset. The results in both general and cold-start scenarios clearly demonstrate the effectiveness of the two frameworks in student performance prediction as well as the superior interpretability of EKT.
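As an illustrative sketch only (not the authors' implementation), the attention-based prediction strategy described for EERNNA can be pictured as follows: given embeddings of past exercises, the hidden student states recorded alongside them, and the embedding of a new exercise, past states are combined with weights given by the similarity between the new exercise and each past one. All function and variable names here are hypothetical.

```python
import numpy as np

def attention_predict(past_exercise_embs, past_states, new_exercise_emb):
    """Toy EERNNA-style attention: weight past hidden states by the
    cosine similarity between the new exercise and each past exercise."""
    # Cosine similarity between the new exercise and every past exercise
    sims = past_exercise_embs @ new_exercise_emb
    sims = sims / (np.linalg.norm(past_exercise_embs, axis=1)
                   * np.linalg.norm(new_exercise_emb) + 1e-8)
    # Softmax over similarities yields the attention weights
    weights = np.exp(sims - sims.max())
    weights = weights / weights.sum()
    # Attended student state: similarity-weighted sum of past hidden states
    return weights @ past_states

# Two past exercises (2-d embeddings) with 3-d hidden states
embs = np.array([[1.0, 0.0], [0.0, 1.0]])
states = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
new = np.array([1.0, 0.0])  # identical to the first past exercise
attended = attention_predict(embs, states, new)
```

In the full framework this attended state would then be fed, together with the new exercise's encoding, into a prediction layer; the sketch only shows how similar past exercises contribute more to the state used for prediction.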
Pages: 100-115 (16 pages)