Proper and Useful Distractors in Multiple-Choice Diagnostic Classification Models

Cited by: 1
Authors
Kohn, Hans Friedrich [1 ]
Chiu, Chia-Yi [2 ]
Wang, Yu [2 ]
Affiliations
[1] Univ Illinois, Dept Psychol, Urbana, IL 61801 USA
[2] Univ Minnesota Twin Cities, Educ Psychol, Minneapolis, MN USA
Source
QUANTITATIVE PSYCHOLOGY, 2023, Vol. 422
Keywords
Cognitive diagnosis; Nonparametric cognitive diagnosis; Polytomous items; MC-DINA; MC-NPC; COGNITIVE DIAGNOSIS;
DOI
10.1007/978-3-031-27781-8_9
Chinese Library Classification
O1 [Mathematics];
Discipline Code
0701 ; 070101 ;
Abstract
The multiple-choice (MC) item format has been implemented in educational assessments across diverse content domains. MC items comprise two components: the stem, which provides the context with a motivating narrative, and the collection of response options, consisting of the correct answer, called the "key," and several incorrect alternatives, the "distractors." The MC-DINA model was the first diagnostic classification model for MC items to use distractors explicitly as potential sources of diagnostic information. However, the MC-DINA model requires that the q-vectors of the distractors be nested within each other and within that of the key, which poses a serious constraint on item development. Consequently, later adaptations of the MC item format to cognitive diagnosis dropped the nestedness condition. The relaxation of the nestedness condition, however, comes at a price: distractors may become redundant (i.e., they do not contribute to any further diagnostic differentiation between examinees), and they may induce undesirable diagnostic ambiguity (i.e., they are equally likely to be chosen by an examinee, but their q-vectors point to different diagnostic classifications). In this article, two criteria, useful and proper, are proposed to identify redundant and diagnostically ambiguous distractors.
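The nestedness condition described in the abstract can be illustrated with a minimal sketch (not taken from the chapter; function and variable names are hypothetical): a q-vector is nested within another if every attribute it requires is also required by the other, i.e., componentwise dominance of 0/1 vectors.

```python
# Illustrative sketch of the MC-DINA nestedness condition (assumption:
# q-vectors are 0/1 tuples over the same set of attributes).

def is_nested(q_inner, q_outer):
    """True if q_inner requires only attributes that q_outer also requires."""
    return all(a <= b for a, b in zip(q_inner, q_outer))

key = (1, 1, 0)                      # q-vector of the correct option
distractors = [(1, 0, 0), (0, 1, 0)]  # q-vectors of two distractors

# Each distractor is nested within the key ...
print([is_nested(d, key) for d in distractors])   # [True, True]

# ... but the distractors are not nested within each other, so this item
# would still violate the full chain-nestedness required by MC-DINA.
print(is_nested(distractors[0], distractors[1]))  # False
```

Distractors such as `(1, 0, 0)` and `(0, 1, 0)` above also hint at the diagnostic ambiguity discussed in the abstract: if both were equally attractive to an examinee, their q-vectors would point to different attribute profiles.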
Pages: 97-106
Page count: 10
Related Papers
26 records in total
  • [1] Cognitive diagnostic models for tests with multiple-choice and constructed-response items
    Kuo, Bor-Chen
    Chen, Chun-Hua
    Yang, Chih-Wei
    Mok, Magdalena Mo Ching
    EDUCATIONAL PSYCHOLOGY, 2016, 36 (06) : 1115 - 1133
  • [2] Nonparametric Classification Method for Multiple-Choice Items in Cognitive Diagnosis
    Wang, Yu
    Chiu, Chia-Yi
    Koehn, Hans Friedrich
    JOURNAL OF EDUCATIONAL AND BEHAVIORAL STATISTICS, 2023, 48 (02) : 189 - 219
  • [3] DINA Models for Multiple-Choice Items With Few Parameters: Considering Incorrect Answers
    Ozaki, Koken
    APPLIED PSYCHOLOGICAL MEASUREMENT, 2015, 39 (06) : 431 - 447
  • [4] Cognitive Diagnosis Testlet Model for Multiple-Choice Items
    Guo, Lei
    Zhou, Wenjie
    Li, Xiao
    JOURNAL OF EDUCATIONAL AND BEHAVIORAL STATISTICS, 2024, 49 (01) : 32 - 60
  • [5] A mixture item response model for multiple-choice data
    Bolt, DM
    Cohen, AS
    Wollack, JA
    JOURNAL OF EDUCATIONAL AND BEHAVIORAL STATISTICS, 2001, 26 (04) : 381 - 409
  • [6] A Cognitive Diagnosis Model for Cognitively Based Multiple-Choice Options
    de la Torre, Jimmy
    APPLIED PSYCHOLOGICAL MEASUREMENT, 2009, 33 (03) : 163 - 183
  • [7] Modeling Hierarchical Attribute Structures in Diagnostic Classification Models with Multiple Attempts
    Kwon, Tae Yeon
    Huggins-Manley, A. Corinne
    Templin, Jonathan
    Zheng, Mingying
    JOURNAL OF EDUCATIONAL MEASUREMENT, 2024, 61 (02) : 198 - 218
  • [8] Identifiability of Diagnostic Classification Models
    Xu, Gongjun
    Zhang, Stephanie
    PSYCHOMETRIKA, 2016, 81 (03) : 625 - 649
  • [9] Computerized Adaptive Testing for Cognitively Based Multiple-Choice Data
    Yigit, Hulya D.
    Sorrel, Miguel A.
    de la Torre, Jimmy
    APPLIED PSYCHOLOGICAL MEASUREMENT, 2019, 43 (05) : 388 - 401
  • [10] Invariance Properties for General Diagnostic Classification Models
    Bradshaw, Laine P.
    Madison, Matthew J.
    INTERNATIONAL JOURNAL OF TESTING, 2016, 16 (02) : 99 - 118