Examining Differential Item Functions of Different Item Ordered Test Forms According to Item Difficulty Levels

Cited by: 5
Authors:
Cokluk, Omay [1 ]
Gul, Emrah [2 ]
Dogan-Gul, Cilem [3 ]
Affiliations:
[1] Ankara Univ, Fac Educ Sci, Dept Measurement & Evaluat, TR-06100 Ankara, Turkey
[2] Hakkari Univ, Dept Measurement & Evaluat, Hakkari, Turkey
[3] Ankara Univ, Dept Measurement & Evaluat, TR-06100 Ankara, Turkey
Source:
EDUCATIONAL SCIENCES-THEORY & PRACTICE | 2016, Vol. 16, Issue 01
Keywords:
Achievement test; Test form; Item order; Item difficulty; Classical test theory; Item response theory; Differential item function; R programming language;
DOI:
10.12738/estp.2016.1.0329
Chinese Library Classification (CLC):
G40 [Education];
Subject classification codes:
040101 ; 120403 ;
Abstract:
The study examines whether differential item functioning (DIF) appears across three test forms with different item orders (a random version and two sequential versions, easy-to-hard and hard-to-easy), using methods based on Classical Test Theory (CTT) and Item Response Theory (IRT) and taking item difficulty levels into account. In this correlational study, data were gathered from 578 seventh graders using an Atomic Structures Achievement Test. The R programming language and the "difR" package were employed for all analyses. The analyses showed that the IRT- and CTT-based methods differed in the number of items flagged with significant differential item functioning. Different item orderings led students of the same ability level to perform differently on the same items; that is, item order changed the probability of a correct response for examinees of equal ability. A test form sequenced from easy to hard proved more advantageous than a hard-to-easy or random ordering. The findings indicate that tests used to make decisions about people must be assembled in accordance with psychometric principles.
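The CTT-based DIF procedures the abstract refers to are typically variants of the Mantel-Haenszel approach, which the difR package implements. As a minimal, hypothetical sketch (not the authors' code, which was written in R with difR), the Mantel-Haenszel common odds ratio and the ETS delta statistic for one dichotomous item can be illustrated in Python:

```python
import math

def mantel_haenszel_dif(strata):
    """Mantel-Haenszel DIF statistic for one dichotomous item.

    strata: list of 2x2 tables, one per ability stratum, given as
    (ref_correct, ref_incorrect, focal_correct, focal_incorrect).
    Returns the common odds ratio alpha and the ETS delta value
    (delta near 0 suggests negligible DIF; |delta| >= 1.5 is
    conventionally treated as large DIF).
    """
    num = den = 0.0
    for a, b, c, d in strata:
        t = a + b + c + d
        if t == 0:
            continue  # skip empty ability strata
        num += a * d / t  # reference-correct x focal-incorrect
        den += b * c / t  # reference-incorrect x focal-correct
    alpha = num / den
    delta = -2.35 * math.log(alpha)  # ETS delta transformation
    return alpha, delta

# Example: a single ability stratum in which the reference group
# answers correctly far more often than an equally able focal group.
alpha, delta = mantel_haenszel_dif([(20, 10, 10, 20)])
```

The snippet only illustrates the kind of statistic such CTT-based DIF methods compute; the study itself ran the analyses in R via difR.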
Pages: 319-330
Page count: 12
Related papers (50 in total):
  • [41] Examining the Flynn Effect in the General Social Survey Vocabulary test using item response theory
    Beaujean, A. Alexander
    Sheng, Yanyan
    PERSONALITY AND INDIVIDUAL DIFFERENCES, 2010, 48 (03) : 294 - 298
  • [42] Wald χ2 Test for Differential Item Functioning Detection with Polytomous Items in Multilevel Data
    Huang, Sijia
    Valdivia, Dubravka Svetina
    EDUCATIONAL AND PSYCHOLOGICAL MEASUREMENT, 2024, 84 (03) : 530 - 548
  • [43] The Effect of Item Pools of Different Strengths on the Test Results of Computerized-Adaptive Testing
    Kezer, Fatih
    INTERNATIONAL JOURNAL OF ASSESSMENT TOOLS IN EDUCATION, 2021, 8 (01): : 145 - 155
  • [44] A comparison of IRT-based methods for examining differential item functioning in TIMSS 2011 mathematics subtest
    Ozdemir, Burhanettin
    INTERNATIONAL CONFERENCE ON NEW HORIZONS IN EDUCATION, INTE 2014, 2015, 174 : 2075 - 2083
  • [45] Comparison of test equating methods based on item response theory according to the sample size and ability distribution
    Kilmen, Sevilay
    Demirtasli, Nukhet
    4TH WORLD CONFERENCE ON EDUCATIONAL SCIENCES (WCES-2012), 2012, 46 : 130 - 134
  • [46] Latent variable mixture models to test for differential item functioning: a population-based analysis
    Wu, Xiuyun
    Sawatzky, Richard
    Hopman, Wilma
    Mayo, Nancy
    Sajobi, Tolulope T.
    Liu, Juxin
    Prior, Jerilynn
    Papaioannou, Alexandra
    Josse, Robert G.
    Towheed, Tanveer
    Davison, K. Shawn
    Lix, Lisa M.
    HEALTH AND QUALITY OF LIFE OUTCOMES, 2017, 15
  • [48] Examinee Characteristics and their Impact on the Psychometric Properties of a Multiple Choice Test According to the Item Response Theory (IRT)
    Almaleki, Deyab
    ENGINEERING TECHNOLOGY & APPLIED SCIENCE RESEARCH, 2021, 11 (02) : 6889 - 6901
  • [49] Unidimensional IRT Item Parameter Estimates Across Equivalent Test Forms With Confounding Specifications Within Dimensions
    Matlock, Ki Lynn
    Turner, Ronna
    EDUCATIONAL AND PSYCHOLOGICAL MEASUREMENT, 2016, 76 (02) : 258 - 279
  • [50] Identifying meaningful change on PROMIS short forms in cancer patients: a comparison of item response theory and classic test theory frameworks
    Lee, Minji K.
    Peipert, John D.
    Cella, David
    Yost, Kathleen J.
    Eton, David T.
    Novotny, Paul J.
    Sloan, Jeff A.
    Dueck, Amylou C.
    QUALITY OF LIFE RESEARCH, 2023, 32 : 1355 - 1367