An Empirical Comparison of Information-Theoretic Selection Criteria for Multivariate Behavior Genetic Models

Cited by: 0
Authors: Kristian E. Markon, Robert F. Krueger
Affiliation: [1] Department of Psychology, University of Minnesota
Source: Behavior Genetics, 2004, Vol. 34
Keywords: Akaike's Information Criterion (AIC); Bayesian Information Criterion (BIC); Minimum Description Length (MDL); model selection; Monte Carlo
DOI: not available
Abstract
Information theory provides an attractive basis for statistical inference and model selection. However, little is known about the relative performance of different information-theoretic criteria in covariance structure modeling, especially in behavioral genetic contexts. To explore these issues, information-theoretic fit criteria were compared with regard to their ability to discriminate between multivariate behavioral genetic models under various model, distribution, and sample size conditions. Results indicate that performance depends on sample size, model complexity, and distributional specification. The Bayesian Information Criterion (BIC) is more robust to distributional misspecification than Akaike's Information Criterion (AIC) under certain conditions, and outperforms AIC in larger samples and when comparing more complex models. An approximation to the Minimum Description Length (MDL; Rissanen, J. (1996). IEEE Transactions on Information Theory 42:40–47, Rissanen, J. (2001). IEEE Transactions on Information Theory 47:1712–1717) criterion, involving the empirical Fisher information matrix, exhibits variable patterns of performance due to the complexity of estimating Fisher information matrices. Results indicate that a relatively new information-theoretic criterion, Draper's Information Criterion (DIC; Draper, 1995), which shares features of the Bayesian and MDL criteria, performs similarly to or better than BIC. Results emphasize the importance of further research into theory and computation of information-theoretic criteria.
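Note: the abstract names the criteria but does not reproduce their formulas. For orientation only, the standard textbook forms are given below (a sketch, not the authors' exact implementations), where \(\ln L(\hat{\theta})\) is the maximized log-likelihood, \(k\) the number of free parameters, \(N\) the sample size, and \(I(\theta)\) the per-observation Fisher information matrix:

\[
\mathrm{AIC} = -2\ln L(\hat{\theta}) + 2k, \qquad
\mathrm{BIC} = -2\ln L(\hat{\theta}) + k\ln N
\]

Rissanen's (1996) stochastic-complexity expansion, as commonly stated, adds a Fisher-information term, which is where estimation of the empirical Fisher information matrix mentioned above enters:

\[
\mathrm{MDL} \approx -\ln L(\hat{\theta}) + \frac{k}{2}\ln\frac{N}{2\pi} + \ln\!\int\!\sqrt{\det I(\theta)}\,d\theta
\]

The precise form of Draper's DIC and the computational details used in the comparison are defined in the full text of the paper.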
Pages: 593–610 (17 pages)