DIF Detection for Multiple Groups: Comparing Three-Level GLMMs and Multiple-Group IRT Models

Cited by: 0
Authors
Koehler, Carmen [1 ]
Hartig, Johannes [1 ]
Khorramdel, Lale [2 ,3 ]
Pokropek, Artur [4 ]
Affiliations
[1] Leibniz Inst Res & Informat Educ, DIPF, Rostocker Str 6, D-60323 Frankfurt, Germany
[2] Boston Coll, TIMSS Int Study Ctr, Chestnut Hill, MA 02467 USA
[3] Boston Coll, PIRLS Int Study Ctr, Chestnut Hill, MA 02467 USA
[4] Educ Res Inst, Educ Res Res Programme, Ljubljana, Slovenia
Keywords
MEASUREMENT INVARIANCE;
DOI
10.1111/jedm.12384
Chinese Library Classification: G44 [Educational Psychology]
Discipline codes: 0402; 040202
Abstract
For assessment scales applied to different groups (e.g., students from different states; patients in different countries), multigroup differential item functioning (MG-DIF) needs to be evaluated in order to ensure that respondents with the same trait level but from different groups have equal response probabilities on a particular item. The current study compares two approaches for DIF detection: a multiple-group item response theory (MG-IRT) model and a generalized linear mixed model (GLMM). In the MG-IRT model approach, item parameters are constrained to be equal across groups and DIF is evaluated for each item in each group. In the GLMM, groups are treated as random, and item difficulties are modeled as correlated random effects with a joint multivariate normal distribution. Its nested structure allows the estimation of item difficulty variances and covariances at the group level. We use an excerpt from the PISA 2015 reading domain as an illustrative empirical example, and conduct a simulation study to compare the performance of the two approaches. Results from the empirical investigation show that the detection of countries with DIF is similar in both approaches. Results from the simulation study confirm this finding and indicate slight advantages of the MG-IRT model approach.
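The abstract's core contrast, that a uniform-DIF item shifts its difficulty for one group while respondents with the same trait level should otherwise have equal response probabilities, can be illustrated with a minimal simulation. The sketch below is not the paper's MG-IRT or GLMM estimation; it only simulates Rasch-type responses with one DIF item and applies a crude marginal screen. All quantities (the 1.5 difficulty shift, the 0.6 flagging threshold, group and item counts) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical illustration of uniform DIF under a Rasch model:
# P(correct) = sigmoid(theta - b). A DIF item carries a group-specific
# difficulty shift, so respondents with the same theta but from
# different groups have different response probabilities on that item.

def simulate_responses(n_per_group, base_b, group_dif):
    """Simulate dichotomous responses per group; group_dif is (groups x items)."""
    data = []
    for shifts in group_dif:
        theta = rng.normal(0.0, 1.0, size=(n_per_group, 1))
        p = 1.0 / (1.0 + np.exp(-(theta - (base_b + shifts))))
        data.append(rng.random(p.shape) < p)
    return data

n_groups, n_items = 4, 10
base_b = np.linspace(-1.0, 1.0, n_items)
group_dif = np.zeros((n_groups, n_items))
group_dif[2, 0] = 1.5            # item 0 is harder in group 2 (uniform DIF)

responses = simulate_responses(2000, base_b, group_dif)

# Crude screen (not the fitted models compared in the paper): group-specific
# logit "difficulties", centered within each group to absorb overall group
# differences, then compared against the cross-group mean per item.
p_hat = np.array([r.mean(axis=0) for r in responses])
logit_b = -np.log(p_hat / (1.0 - p_hat))
centered = logit_b - logit_b.mean(axis=1, keepdims=True)
deviation = centered - centered.mean(axis=0)
flags = np.abs(deviation) > 0.6  # arbitrary screening threshold

print(np.argwhere(flags))        # lists (group, item) pairs flagged for DIF
```

With these settings, only the (group 2, item 0) cell should exceed the threshold, mirroring the abstract's point that DIF is evaluated per item within each group.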
Pages: 325-344 (20 pages)