Response Demands of Reading Comprehension Test Items: A Review of Item Difficulty Modeling Studies

Cited by: 2
Authors
Ferrara, Steve [1 ]
Steedle, Jeffrey T. [2 ]
Frantz, Roger S. [3 ]
Affiliations
[1] Cognia Learning Grp, 9115 Westside Pkwy, Alpharetta, GA 30009 USA
[2] ACT Inc, Res, Iowa City, IA USA
[3] NWEA, MetaMetr Res & Dev, Portland, OR USA
Keywords
DOCUMENT LITERACY; PERFORMANCE; VARIABLES
DOI
10.1080/08957347.2022.2103135
Chinese Library Classification (CLC)
G40 [Education]
Discipline Classification Codes
040101; 120403
Abstract
Item difficulty modeling studies involve (a) hypothesizing item features, or item response demands, that are likely to predict item difficulty with some degree of accuracy; and (b) entering the features as independent variables into a regression equation or other statistical model to predict difficulty. In this review, we report findings from 13 empirical item difficulty modeling studies of reading comprehension tests. We define reading comprehension item response demands as reading passage variables (e.g., length, complexity), passage-by-item variables (e.g., degree of correspondence between item and text, type of information requested), and item stem and response option variables. We report on response demand variables that are related to item difficulty and illustrate how they can be used to manage item difficulty in construct-relevant ways so that empirical item difficulties are within a targeted range (e.g., located within the Proficient or other proficiency level range on a test's IRT scale, where intended).
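To make the modeling step concrete, the sketch below fits an ordinary least squares regression of empirical item difficulties (IRT b-parameters) on coded response demands. It is a minimal illustration only: the feature names (passage length, passage complexity, item-text correspondence) follow the variable categories named in the abstract, but all data values and coding schemes here are hypothetical, not taken from the reviewed studies.

```python
# Minimal sketch of an item difficulty modeling regression.
# All feature values and difficulty estimates are hypothetical.
import numpy as np

# Each row is one reading comprehension item.
# Columns: passage length (words), passage complexity (a readability index),
# item-text correspondence (1 = verbatim match ... 3 = inference required).
features = np.array([
    [250, 5.1, 1],
    [400, 7.3, 2],
    [380, 6.8, 3],
    [520, 8.0, 3],
    [310, 5.9, 2],
    [450, 7.7, 1],
], dtype=float)

# Empirical IRT b-parameters (item difficulties) for the same items (hypothetical).
difficulty = np.array([-0.8, 0.1, 0.6, 1.2, -0.2, 0.3])

# Ordinary least squares: difficulty ~ intercept + response-demand features.
X = np.column_stack([np.ones(len(features)), features])
coef, *_ = np.linalg.lstsq(X, difficulty, rcond=None)

# How much difficulty variance the response demands account for.
predicted = X @ coef
r_squared = 1 - np.sum((difficulty - predicted) ** 2) / np.sum((difficulty - difficulty.mean()) ** 2)

print("Intercept and feature weights:", np.round(coef, 3))
print("R^2 of the difficulty model:", round(r_squared, 3))
```

In practice, features with large, stable weights in such a model are candidates for manipulating item difficulty in construct-relevant ways during item writing.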
Pages: 237-253
Number of pages: 17
Related Papers
50 records in total
  • [1] Item difficulty modeling of paragraph comprehension items
    Gorin, Joanna S.
    Embretson, Susan E.
    APPLIED PSYCHOLOGICAL MEASUREMENT, 2006, 30 (05) : 394 - 411
  • [2] Modeling Local Item Dependence in Cloze and Reading Comprehension Test Items Using Testlet Response Theory
    Baghaei, Purya
    Ravand, Hamdollah
    PSICOLOGICA, 2016, 37 (01): : 85 - 104
  • [3] Reading-Comprehension Test Item Difficulty as a Function of Cognitive Processing Variables
    Garrison, W.
    Dowaliby, F.
    Long, G.
    AMERICAN ANNALS OF THE DEAF, 1992, 137 (01) : 22 - 30
  • [4] Flexible Item Response Modeling for Timed Reading Comprehension Assessment
    Forthmann, Boris
    Lenhard, Wolfgang
    Lenhard, Alexandra
    Foerster, Natalie
    JOURNAL OF EXPERIMENTAL EDUCATION, 2024,
  • [5] Decomposing Difficulty of Reading Literacy Test Items
    Ivanova, Alina Ye.
    Antipkina, Inna V.
    VOPROSY OBRAZOVANIYA-EDUCATIONAL STUDIES MOSCOW, 2023, (03):
  • [6] Modeling Item Difficulty in a Dynamic Test
    Hauenstein, Clifford E.
    Embretson, Susan E.
    JOURNAL OF COGNITIVE EDUCATION AND PSYCHOLOGY, 2020, 19 (02): : 93 - 106
  • [7] Item Bias in a Test of Reading Comprehension
    Linn, R. L.
    Levine, M. V.
    Hastings, C. N.
    Wardrop, J. L.
    APPLIED PSYCHOLOGICAL MEASUREMENT, 1981, 5 (02) : 159 - 173
  • [8] Modeling the difficulty of quantitative reasoning items: Implications for item generation
    Enright, M. K.
    Sheehan, K. M.
    ITEM GENERATION FOR TEST DEVELOPMENT, 2002, : 129 - 157
  • [9] Do Comprehension Items Really Test Reading? Sometimes
    Johns, J. L.
    JOURNAL OF READING, 1978, 21 (07): : 615 - 619
  • [10] Reading comprehension without reading? On the construct validity of multiple-choice reading comprehension test items
    Rost, Detlef H.
    Sparfeldt, Joern R.
    ZEITSCHRIFT FUR PADAGOGISCHE PSYCHOLOGIE, 2007, 21 (3-4): : 305 - 314