An improved Akaike information criterion for state-space model selection

Cited by: 66
Authors
Bengtsson, Thomas [ 2 ]
Cavanaugh, Joseph E. [1 ]
Affiliations
[1] Univ Iowa, Dept Biostat, Iowa City, IA 52242 USA
[2] Univ Calif Berkeley, Dept Stat, Berkeley, CA 94720 USA
Funding
National Science Foundation (USA);
Keywords
AIC; Kullback-Leibler information; Kullback's directed divergence; state-space model; time series analysis;
DOI
10.1016/j.csda.2005.05.003
Chinese Library Classification (CLC) number
TP39 [Computer Applications];
Subject classification codes
081203; 0835;
Abstract
Following the work of Hurvich, Shumway, and Tsai [1990, Improved estimators of Kullback-Leibler information for autoregressive model selection in small samples. Biometrika 77, 709-719], we propose an "improved" variant of the Akaike information criterion, AICi, for state-space model selection. The variant is based on Akaike's [1973, Information theory and an extension of the maximum likelihood principle. Second International Symposium on Information Theory, Akademiai Kiado, pp. 267-281] objective of estimating the Kullback-Leibler information [Kullback, 1968, Information Theory and Statistics. Dover, New York] between the densities corresponding to the fitted model and the generating or true model. The development of AICi proceeds by decomposing the expected information into two terms. The first term suggests that the empirical log-likelihood can be used to form a biased estimator of the information; the second term provides the bias adjustment. Exact computation of the bias adjustment requires the values of the true model parameters, which are inaccessible in practical applications. Yet for fitted models in the candidate class that are correctly specified or overfit, the adjustment is asymptotically independent of the true parameters. Thus, in certain settings, the adjustment may be estimated via Monte Carlo simulation, using conveniently chosen simulation parameters as proxies for the true parameters. We present simulation results to evaluate the performance of AICi both as an estimator of the Kullback-Leibler information and as a model selection criterion. Our results indicate that AICi estimates the information with less bias than the traditional AIC. Furthermore, AICi serves as an effective tool for selecting a model of appropriate dimension. (C) 2005 Elsevier B.V. All rights reserved.
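To make the two-term decomposition and the Monte Carlo bias adjustment described in the abstract concrete, the following is a minimal sketch in generic notation; the symbols d, Delta, B, theta*, R and the particular estimator form are illustrative assumptions, not taken verbatim from the paper.

% Kullback-Leibler discrepancy between the true density f(. | theta_0) and a
% candidate density of dimension k evaluated at its maximum likelihood fit:
%   d(\theta, \theta_0) = E_{\theta_0}\{ -2 \log f(Y \mid \theta) \}.
% The selection target is the expected discrepancy, split into an observable
% term plus a bias adjustment B(k, \theta_0):
\begin{align*}
  \Delta(k, \theta_0)
    &= E_{\theta_0}\{ d(\hat{\theta}_k, \theta_0) \} \\
    &= E_{\theta_0}\{ -2 \log f(y \mid \hat{\theta}_k) \} \;+\; B(k, \theta_0),
\end{align*}
% so the empirical -2 log-likelihood estimates the first term, with bias
% B(k, \theta_0). For correctly specified or overfit candidates this bias is
% asymptotically free of \theta_0, so it can be approximated by simulating R
% pairs of independent series y^{(r)}, w^{(r)} from the candidate model at a
% convenient proxy value \theta^{*} and refitting the model to each y^{(r)}:
\begin{equation*}
  \hat{B}_R(k) = \frac{1}{R} \sum_{r=1}^{R}
    \Big[ -2 \log f\big(w^{(r)} \mid \hat{\theta}_k^{(r)}\big)
          + 2 \log f\big(y^{(r)} \mid \hat{\theta}_k^{(r)}\big) \Big],
  \qquad
  \text{AICi} = -2 \log f(y \mid \hat{\theta}_k) + \hat{B}_R(k).
\end{equation*}

Replacing the simulated adjustment by its large-sample value 2k recovers the usual AIC penalty, which is why the criterion above behaves like a small-sample refinement of AIC.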
Pages: 2635-2654
Number of pages: 20
References
25 items in total
[1] Akaike, H. (1969). Fitting autoregressive models for prediction. Annals of the Institute of Statistical Mathematics, 21(2), 243-247.
[2] Akaike, H. (1978). A Bayesian analysis of the minimum AIC procedure. Annals of the Institute of Statistical Mathematics, 30(1), 9-14.
[3] Akaike, H. (1974). A new look at the statistical model identification. IEEE Transactions on Automatic Control, AC-19(6), 716-723.
[4] Akaike, H. (1973). Information theory and an extension of the maximum likelihood principle. In: Second International Symposium on Information Theory, Akademiai Kiado, pp. 267-281. DOI: 10.1007/978-1-4612-1694-0_15.
[5] [Anonymous] (1999). Applied Multivariate Analysis.
[6] Brockwell, P.J., Davis, R.A. (1991). Time Series: Theory and Methods. Springer.
[7] Cavanaugh, J.E. (1997). Statistica Sinica, 7, 473.
[8] Cavanaugh, J.E. (1997). Unifying the derivations for the Akaike and corrected Akaike information criteria. Statistics & Probability Letters, 33(2), 201-208.
[9] De Jong, P. (1989). Smoothing and interpolation with the state-space model. Journal of the American Statistical Association, 84(408), 1085-1088.
[10] Hannan, E.J. (1979). Journal of the Royal Statistical Society, Series B, 41, 190.