Model Selection and Psychological Theory: A Discussion of the Differences Between the Akaike Information Criterion (AIC) and the Bayesian Information Criterion (BIC)

Cited by: 1267
Authors
Vrieze, Scott I. [1 ,2 ]
Affiliations
[1] Univ Minnesota, Dept Psychol, Minneapolis, MN 55455 USA
[2] Minneapolis VA Med Ctr, Minneapolis, MN USA
Keywords
Akaike information criterion; Bayesian information criterion; model selection; factor analysis; theory testing; LATENT CLASSES; APPROXIMATIONS; DIMENSION; CHOICE; INDEX;
DOI
10.1037/a0027127
Chinese Library Classification (CLC)
B84 [Psychology];
Discipline Code
04; 0402;
Abstract
This article reviews the Akaike information criterion (AIC) and the Bayesian information criterion (BIC) in model selection and the appraisal of psychological theory. The focus is on latent variable models, given their growing use in theory testing and construction. Theoretical statistical results in regression are discussed, and more important issues are illustrated with novel simulations involving latent variable models including factor analysis, latent profile analysis, and factor mixture models. Asymptotically, the BIC is consistent, in that it will select the true model if, among other assumptions, the true model is among the candidate models considered. The AIC is not consistent under these circumstances. When the true model is not in the candidate model set, the AIC is efficient, in that it will asymptotically choose whichever model minimizes the mean squared error of prediction/estimation. The BIC is not efficient under these circumstances. Unlike the BIC, the AIC also has a minimax property, in that it can minimize the maximum possible risk in finite sample sizes. In sum, the AIC and BIC have quite different properties that require different assumptions, and applied researchers and methodologists alike will benefit from improved understanding of the asymptotic and finite-sample behavior of these criteria. The ultimate decision to use the AIC or BIC depends on many factors, including the loss function employed, the study's methodological design, the substantive research question, and the notion of a true model and its applicability to the study at hand.
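For reference, the standard definitions of the two criteria compared in the abstract (supplied here for context; they are not reproduced in the abstract itself) are

    \mathrm{AIC} = -2\ln\hat{L} + 2k, \qquad \mathrm{BIC} = -2\ln\hat{L} + k\ln n,

where \hat{L} is the maximized likelihood of a candidate model, k is its number of freely estimated parameters, and n is the sample size; under either criterion, the candidate model with the smaller value is preferred.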
Pages: 228-243
Page count: 16
Related Papers
50 items in total
  • [41] The effective sample size in Bayesian information criterion for level-specific fixed and random-effect selection in a two-level nested model
    Cho, Sun-Joo
    Wu, Hao
    Naveiras, Matthew
    BRITISH JOURNAL OF MATHEMATICAL & STATISTICAL PSYCHOLOGY, 2024, 77 (02) : 289 - 315
  • [42] On a Generalized Burr Life-Testing Model: Characterization, Reliability, Simulation, and Akaike Information Criterion
    Ahsanullah, M.
    Shakil, M.
    Kibria, B. M. Golam
    Elgarhy, M.
    JOURNAL OF STATISTICAL THEORY AND APPLICATIONS, 2019, 18 (03): : 259 - 269
  • [44] A NOTE ON DIFFERENT SELECTION OF BEST-FITTING MODEL BY LIKELIHOOD RATIO TEST AND AKAIKE INFORMATION CRITERION FOR THE ANALYSIS OF CONTINGENCY TABLES
    Miyamoto, Nobuko
    Shinohara, Satoshi
    Inoue, Akira
    Tomizawa, Sadao
    ADVANCES AND APPLICATIONS IN STATISTICS, 2005, 5 (03) : 301 - 312
  • [45] Information geometric model selection criterion and its application in cognition
    Liu, Yun-Hui
    Luo, Si-Wei
    Lv, Zi-Ang
    Huang, Hua
    PROCEEDINGS OF 2006 INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND CYBERNETICS, VOLS 1-7, 2006: 2814+
  • [47] Selection of tuning parameters in bridge regression models via Bayesian information criterion
    Kawano, Shuichi
    STATISTICAL PAPERS, 2014, 55 (04) : 1207 - 1223
  • [48] Focused vector information criterion model selection and model averaging regression with missing response
    Sun, Zhimeng
    Su, Zhi
    Ma, Jingyi
    METRIKA, 2014, 77 (03) : 415 - 432
  • [49] Model Order Determination for Signed Measurements via the Bayesian Information Criterion
    Li, Changheng
    Zhang, Rong
    Li, Jian
    Stoica, Petre
    2018 IEEE 10TH SENSOR ARRAY AND MULTICHANNEL SIGNAL PROCESSING WORKSHOP (SAM), 2018, : 366 - 370
  • [50] A Note on Comparing the Bifactor and Second-Order Factor Models: Is the Bayesian Information Criterion a Routinely Dependable Index for Model Selection?
    Raykov, Tenko
    DiStefano, Christine
    Calvocoressi, Lisa
    EDUCATIONAL AND PSYCHOLOGICAL MEASUREMENT, 2024, 84 (02) : 271 - 288