Item Response Ranking for Cognitive Diagnosis

Cited by: 0
Authors
Tong, Shiwei [1 ,2 ]
Liu, Qi [1 ,2 ]
Yu, Runlong [1 ,2 ]
Huang, Wei [1 ,2 ]
Huang, Zhenya [1 ,2 ]
Pardos, Zachary A. [3 ]
Jiang, Weijie [3 ]
Affiliations
[1] Univ Sci & Technol China, Sch Comp Sci & Technol, Anhui Prov Key Lab Big Data Anal & Applicat, Langfang, Peoples R China
[2] Univ Sci & Technol China, Sch Data Sci, Langfang, Peoples R China
[3] Univ Calif Berkeley, Berkeley, CA USA
Source
PROCEEDINGS OF THE THIRTIETH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, IJCAI 2021 | 2021
Funding
National Natural Science Foundation of China;
Keywords
DOI
None available
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Cognitive diagnosis, a fundamental task in education, aims to reveal students' proficiency levels on knowledge concepts. Monotonicity is one of the basic assumptions in cognitive diagnosis theory: a student's proficiency is monotonic with the probability of giving a correct response to a test item. However, few previous methods consider this monotonicity during optimization. To this end, we propose the Item Response Ranking framework (IRR), which introduces pairwise learning into cognitive diagnosis to model the monotonicity between item responses. Specifically, we first use an item-specific sampling method to sample item responses and construct response pairs based on their partial order, and we propose two-branch sampling methods to handle unobserved responses. We then use a pairwise objective function to exploit the monotonicity in the pair formulation. IRR is a general framework that can be applied to most contemporary cognitive diagnosis models. Extensive experiments demonstrate the effectiveness and interpretability of our method.
Pages: 1750-1756
Page count: 7
Related Papers
50 records in total
  • [31] New Item Selection Methods for Cognitive Diagnosis Computerized Adaptive Testing
    Kaplan, Mehmet
    de la Torre, Jimmy
    Ramon Barrada, Juan
    APPLIED PSYCHOLOGICAL MEASUREMENT, 2015, 39 (03) : 167 - 188
  • [32] Understanding examinees' item responses through cognitive modeling of response accuracy and response times
    Embretson, Susan
    LARGE-SCALE ASSESSMENTS IN EDUCATION, 2023, 11 (01)
  • [34] A Cognitive Diagnosis Model for Continuous Response
    Minchen, Nathan D.
    de la Torre, Jimmy
    Liu, Ying
    JOURNAL OF EDUCATIONAL AND BEHAVIORAL STATISTICS, 2017, 42 (06) : 651 - 677
  • [35] Sensitivity of the Informant Questionnaire on Cognitive Decline: An application of item response theory
    Butt, Zeeshan
    AGING NEUROPSYCHOLOGY AND COGNITION, 2008, 15 (05) : 642 - 655
  • [36] A Grouped Ranking Model for Item Preference Parameter
    Hino, Hideitsu
    Fujimoto, Yu
    Murata, Noboru
    NEURAL COMPUTATION, 2010, 22 (09) : 2417 - 2451
  • [37] Cross Pairwise Ranking for Unbiased Item Recommendation
    Wan, Qi
    He, Xiangnan
    Wang, Xiang
    Wu, Jiancan
    Guo, Wei
    Tang, Ruiming
    PROCEEDINGS OF THE ACM WEB CONFERENCE 2022 (WWW'22), 2022, : 2370 - 2378
  • [38] Weighted Item Ranking for Pairwise Matrix Factorization
    Zhang, Haiyang
    Ganchev, Ivan
    Nikolov, Nikola S.
    O'Droma, Mairtin
    2017 SOUTH EASTERN EUROPEAN DESIGN AUTOMATION, COMPUTER ENGINEERING, COMPUTER NETWORKS AND SOCIAL MEDIA CONFERENCE (SEEDA-CECNSM), 2017, : 43 - 47
  • [39] Dissecting the expanded cognitive reflection test: an item response theory analysis
    Srol, Jakub
    JOURNAL OF COGNITIVE PSYCHOLOGY, 2018, 30 (07) : 643 - 655
  • [40] Privacy-Preserving Fair Item Ranking
    Sun, Jia Ao
    Pentyala, Sikha
    De Cock, Martine
    Farnadi, Golnoosh
    ADVANCES IN INFORMATION RETRIEVAL, ECIR 2023, PT II, 2023, 13981 : 188 - 203