A Pre-trained Knowledge Tracing Model with Limited Data

Cited: 0
Authors
Yue, Wenli [1,3]
Su, Wei [1,3]
Liu, Lei [2 ]
Cai, Chuan [1 ]
Yuan, Yongna [1 ]
Jia, Zhongfeng [1 ]
Liu, Jiamin [1 ]
Xie, Wenjian [1 ]
Affiliations
[1] Lanzhou Univ, Sch Informat Sci & Engn, Lanzhou, Peoples R China
[2] Duzhe Publishing Grp Co Ltd, Lanzhou, Peoples R China
[3] Key Lab Media Convergence Technol & Commun, Lanzhou, Gansu, Peoples R China
Source
DATABASE AND EXPERT SYSTEMS APPLICATIONS, PT I, DEXA 2024 | 2024 / Vol. 14910
Keywords
Knowledge Tracing; Limited Data; Pre-training; Fine-tuning
DOI
10.1007/978-3-031-68309-1_14
CLC Number
TP31 [Computer Software]
Discipline Code
081202; 0835
Abstract
Online education systems have gained increasing popularity because they can fully preserve users' learning data. This advantage enables researchers to assess learners' mastery from their learning trajectories, thereby facilitating personalized education and support. Knowledge tracing, an effective educational aid, models students' implicit knowledge states and predicts their mastery of knowledge from their historical answer records. However, newly developed online learning platforms often lack sufficient historical answer data, which impedes accurate prediction of students' knowledge states and renders existing knowledge tracing models less effective. This paper introduces the first pre-trained knowledge tracing model, which leverages a substantial amount of existing data for pre-training and a smaller dataset for fine-tuning. Validated on several publicly available knowledge tracing datasets, our method significantly improves tracing performance on small datasets, with a maximum AUC increase of 5.07%. Beyond the small-data setting, pre-training on the entire dataset also yields a higher AUC than the baseline, marking a novel direction in knowledge tracing research. Furthermore, the paper analyzes pre-training outcomes when the fine-tuning datasets contain varying numbers of interactions, providing valuable insights for Intelligent Tutoring Systems (ITS).
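To make the pre-train-then-fine-tune recipe concrete, below is a minimal sketch in PyTorch of the workflow the abstract describes, assuming a simple DKT-style LSTM as the base tracer. The model, the synthetic data generator, and all hyperparameters (NUM_SKILLS, hidden size, learning rates, epoch counts) are illustrative assumptions, not the paper's actual architecture or settings.

    # Minimal, illustrative sketch (not the authors' implementation) of
    # pre-training a knowledge tracing model on a large dataset and then
    # fine-tuning it on limited data from a new platform.
    import torch
    import torch.nn as nn

    NUM_SKILLS = 100  # assumed number of knowledge concepts

    class DKT(nn.Module):
        """LSTM over (skill, correctness) interactions; predicts next-step correctness."""
        def __init__(self, num_skills, hidden=64):
            super().__init__()
            # Each interaction is one id: skill + num_skills * correctness.
            self.emb = nn.Embedding(2 * num_skills, hidden)
            self.lstm = nn.LSTM(hidden, hidden, batch_first=True)
            self.out = nn.Linear(hidden, num_skills)  # per-skill correctness logits

        def forward(self, interactions):  # (batch, seq_len) int64
            h, _ = self.lstm(self.emb(interactions))
            return self.out(h)            # (batch, seq_len, num_skills)

    def synthetic_batches(n_batches, batch=32, seq=50):
        """Random stand-in data; a real setup would load learner answer logs."""
        for _ in range(n_batches):
            skill = torch.randint(NUM_SKILLS, (batch, seq))
            correct = torch.randint(2, (batch, seq))
            interactions = skill + NUM_SKILLS * correct
            # Predict the correctness of step t+1 from interactions up to step t.
            yield interactions[:, :-1], skill[:, 1:], correct[:, 1:]

    def run_epoch(model, batches, optimizer):
        loss_fn = nn.BCEWithLogitsLoss()
        for interactions, next_skill, next_correct in batches:
            logits = model(interactions)
            picked = logits.gather(-1, next_skill.unsqueeze(-1)).squeeze(-1)
            loss = loss_fn(picked, next_correct.float())
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()

    model = DKT(NUM_SKILLS)

    # 1) Pre-train on a large existing dataset (stand-in: many synthetic batches).
    pretrain_opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(5):
        run_epoch(model, synthetic_batches(100), pretrain_opt)

    # 2) Fine-tune on the new platform's limited data with a smaller learning rate.
    finetune_opt = torch.optim.Adam(model.parameters(), lr=1e-4)
    for _ in range(3):
        run_epoch(model, synthetic_batches(5), finetune_opt)

As in the sketch, fine-tuning on the limited target data would typically use a smaller learning rate than pre-training; one might additionally freeze the embedding or recurrent layers so the scarce data only adjusts the output head.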
Pages: 163-178
Page count: 16