Group-Level Cognitive Diagnosis: A Multi-Task Learning Perspective

Cited by: 8
Authors
Huang, Jie [1]
Liu, Qi [1]
Wang, Fei [1]
Huang, Zhenya [1]
Fang, Songtao [1]
Wu, Runze [2]
Chen, Enhong [1]
Su, Yu [1,3]
Wang, Shijin [3]
Affiliations
[1] Univ Sci & Technol China, Anhui Prov Key Lab Big Data Anal & Applicat, Sch Comp Sci & Technol, Hefei, Anhui, Peoples R China
[2] NetEase Inc, Fuxi AI Lab, Hangzhou, Peoples R China
[3] IFLYTEK Res, Hefei, Anhui, Peoples R China
Source
2021 21st IEEE International Conference on Data Mining (ICDM 2021), 2021
Funding
National Natural Science Foundation of China
Keywords
Group-Level Cognitive Diagnosis; Multi-Task Learning; Attention Mechanism; Data Sparsity; DINA Model
DOI
10.1109/ICDM51629.2021.00031
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Most cognitive diagnosis research in education has concentrated on individual assessment, aiming to discover the latent characteristics of students. However, in many real-world scenarios, group-level assessment is an important and meaningful task; for example, assessing classes across regions can reveal differences in teaching quality across contexts. In this work, we consider assessing the cognitive ability of a group of students, which aims to mine a group's proficiency on specific knowledge concepts. The key challenge in this task is the sparsity of group-exercise response data, which severely degrades assessment performance. Existing works either do not make effective use of additional student-exercise response data or fail to reasonably model the relationship between group ability and individual ability in different learning contexts, resulting in sub-optimal diagnosis results. To this end, we propose a general Multi-Task based Group-Level Cognitive Diagnosis (MGCD) framework, which features three special designs: 1) we jointly model student-exercise responses and group-exercise responses in a multi-task manner to alleviate the sparsity of group-exercise responses; 2) we design a context-aware attention network to model the relationship between student knowledge states and the group knowledge state in different contexts; 3) we design an interpretable cognitive layer to obtain student ability, group ability, and exercise factors (e.g., difficulty), and we leverage neural networks to learn complex interaction functions among them. Extensive experiments on real-world datasets demonstrate the generality of MGCD and the effectiveness of our attention design and multi-task learning.
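To make the abstract's three design points concrete, the sketch below shows one way they could fit together in PyTorch: shared student mastery embeddings serve both a student-level and a group-level prediction task, a context-aware attention head aggregates member states into a group state, and a NeuralCD-style cognitive layer combines mastery, difficulty, and discrimination before a small learned interaction MLP. This is a minimal sketch under assumed shapes and layer choices; the names (MGCDSketch, ctx_dim, attn_query, the 64-unit MLP) are illustrative, not the authors' exact architecture.

```python
# Hedged sketch of an MGCD-style model; all module names and dimensions
# are assumptions for exposition, not the paper's actual implementation.
import torch
import torch.nn as nn

class MGCDSketch(nn.Module):
    def __init__(self, n_students, n_groups, n_exercises, n_concepts, ctx_dim=16):
        super().__init__()
        self.student_state = nn.Embedding(n_students, n_concepts)   # latent per-concept mastery
        self.exer_diff = nn.Embedding(n_exercises, n_concepts)      # per-concept difficulty
        self.exer_disc = nn.Embedding(n_exercises, 1)               # exercise discrimination
        self.context = nn.Embedding(n_groups, ctx_dim)              # learning-context vector
        self.attn_query = nn.Linear(ctx_dim, n_concepts)            # context -> attention query
        self.interact = nn.Sequential(                              # learned interaction function
            nn.Linear(n_concepts, 64), nn.Sigmoid(),
            nn.Linear(64, 1), nn.Sigmoid(),
        )

    def group_state(self, group_id, member_ids):
        # Context-aware attention: weight each member's knowledge state
        # by its affinity to the group's learning context.
        states = torch.sigmoid(self.student_state(member_ids))      # (m, K) member mastery
        query = self.attn_query(self.context(group_id))             # (K,) context query
        weights = torch.softmax(states @ query, dim=0)              # (m,) member weights
        return weights.unsqueeze(1).mul(states).sum(dim=0)          # (K,) group mastery

    def _predict(self, state, exer_id, q_vec):
        # Interpretable cognitive layer: discrimination * (mastery - difficulty),
        # masked by the exercise's Q-matrix row, then fed to the MLP.
        diff = torch.sigmoid(self.exer_diff(exer_id))               # (K,)
        disc = torch.sigmoid(self.exer_disc(exer_id))               # (1,)
        return self.interact(disc * (state - diff) * q_vec).squeeze(-1)

    def forward_student(self, student_id, exer_id, q_vec):
        return self._predict(torch.sigmoid(self.student_state(student_id)), exer_id, q_vec)

    def forward_group(self, group_id, member_ids, exer_id, q_vec):
        return self._predict(self.group_state(group_id, member_ids), exer_id, q_vec)
```

A usage sketch; in training, a joint loss over both tasks is what lets the abundant student-exercise responses compensate for the sparse group-exercise responses:

```python
model = MGCDSketch(n_students=1000, n_groups=30, n_exercises=500, n_concepts=20)
q_vec = torch.zeros(20); q_vec[3] = 1.0                          # exercise tests concept 3
p_s = model.forward_student(torch.tensor(7), torch.tensor(42), q_vec)
p_g = model.forward_group(torch.tensor(0), torch.tensor([7, 8, 9]), torch.tensor(42), q_vec)
# Multi-task objective (assumed form): loss = bce(p_s, y_s) + lam * bce(p_g, y_g)
```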
Pages: 210-219
Page count: 10