Approaches to Item-Level Data with Cross-Classified Structure: An Illustration with Student Evaluation of Teaching

Cited by: 0
Authors
Huang, Sijia [1 ]
Affiliations
[1] Indiana Univ, Sch Educ, Bloomington, IN 47405 USA
Keywords
Student evaluation of teaching (SET); cross-classified data; multilevel item response theory model; higher education; item response theory; MAXIMUM-LIKELIHOOD; RELIABILITY; VALIDITY; MODEL; PARAMETERS; BIASES
DOI
10.1080/00273171.2023.2288589
Chinese Library Classification (CLC) number
O1 [Mathematics]
Discipline classification codes
0701; 070101
Abstract
Student evaluation of teaching (SET) questionnaires are widely used in higher education institutions in North America for both formative and summative purposes. Data collected from SET questionnaires are typically item-level data with a cross-classified structure: they combine multivariate categorical outcomes (i.e., multiple Likert-type items on the questionnaire) with cross-classification (i.e., students and instructors are non-nested). Recently, a new approach, the cross-classified IRT model, was proposed to handle SET data appropriately. To inform researchers in higher education, this article reviews the cross-classified IRT model alongside three existing approaches applied in SET studies: the cross-classified random effects model (CCREM), the multilevel item response theory (MLIRT) model, and a two-step integrated strategy. The strengths and weaknesses of each of the four approaches are discussed. Additionally, the new and existing approaches are compared through an empirical data analysis and a preliminary simulation study. The article concludes with general suggestions for analyzing SET data and a discussion of limitations and future research directions.
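The cross-classified structure described in the abstract (students crossed with instructors, each rating multiple Likert-type items) can be illustrated with a small simulation. This is a minimal sketch, not the article's model or data: the sample sizes, effect standard deviations, and response thresholds below are arbitrary assumptions chosen only to show how non-nested student and instructor effects jointly generate item-level ordinal responses.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (not from the article): 200 students,
# 30 instructors, 5 Likert-type items; each student rates 3 instructors.
n_students, n_instructors, n_items = 200, 30, 5

# Crossed (non-nested) random effects, as in a CCREM: a student-side
# leniency effect and an instructor-side teaching-quality effect.
theta_student = rng.normal(0.0, 0.5, n_students)   # student leniency
theta_instr = rng.normal(0.0, 0.8, n_instructors)  # instructor quality
item_shift = np.linspace(-0.5, 0.5, n_items)       # item "difficulty"

rows = []
for s in range(n_students):
    # Each student rates several instructors, so students are not
    # nested within instructors -- the data are cross-classified.
    for i in rng.choice(n_instructors, size=3, replace=False):
        for j in range(n_items):
            latent = (theta_instr[i] + theta_student[s]
                      - item_shift[j] + rng.normal())
            # Discretize the latent rating onto a 5-point Likert
            # scale via fixed (assumed) thresholds.
            y = 1 + np.searchsorted([-1.5, -0.5, 0.5, 1.5], latent)
            rows.append((s, i, j, y))

# Long-format item-level data: (student, instructor, item, response).
data = np.array(rows)
print(data.shape)  # → (3000, 4)
```

In this long format, each row carries both a student index and an instructor index, which is exactly the shape of data the reviewed approaches (CCREM, MLIRT, cross-classified IRT) take as input, differing in whether they model the items as separate categorical outcomes or as an aggregated score.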
Pages: 523–542
Page count: 20