A criterion for vector autoregressive model selection based on Kullback's symmetric divergence

Cited by: 0
Authors
Seghouane, AK [1 ]
Affiliation
[1] Natl ICT Australia Ltd, Canberra, ACT 2601, Australia
Source
2005 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, VOLS 1-5: SPEECH PROCESSING | 2005
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405
Abstract
The Kullback Information Criterion, KIC, and its univariate bias-corrected version, KICc, are two recently developed criteria for model selection. In this paper, a small-sample model selection criterion for vector autoregressive models is developed. The proposed criterion is named KICvc, where the notation "vc" stands for vector correction; it can be considered an extension of KIC to vector autoregressive models. KICvc is an unbiased estimator of a variant of the Kullback symmetric divergence, assuming that the true model is correctly specified or overfitted. Simulation results show that the proposed criterion estimates the model order more accurately than other asymptotically efficient methods when applied to vector autoregressive model selection in small samples.
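The order-selection procedure the abstract describes can be sketched as follows: fit VAR(p) models of increasing order by least squares and pick the order minimizing an information criterion. Since the paper's exact KICvc formula is not reproduced in this record, the sketch below substitutes a generic KIC-style score, n·log det(Σ̂) + 3k, as a stand-in; the function names and the penalty form are assumptions for illustration, not the paper's definition.

```python
import numpy as np

def fit_var(y, p):
    """Least-squares fit of a VAR(p) with intercept; returns (residual covariance, effective n)."""
    n, m = y.shape
    rows = n - p
    # Regressor matrix: [1, y_{t-1}, ..., y_{t-p}] for t = p..n-1
    X = np.ones((rows, 1 + m * p))
    for lag in range(1, p + 1):
        X[:, 1 + m * (lag - 1): 1 + m * lag] = y[p - lag:n - lag]
    Y = y[p:]
    B, *_ = np.linalg.lstsq(X, Y, rcond=None)
    resid = Y - X @ B
    sigma = resid.T @ resid / rows   # ML estimate of the residual covariance
    return sigma, rows

def kic_var(y, p):
    """Generic KIC-style score (hypothetical stand-in for KICvc): n*log det(Sigma_hat) + 3k."""
    sigma, rows = fit_var(y, p)
    m = y.shape[1]
    k = m * (m * p + 1)              # total number of regression coefficients
    return rows * np.log(np.linalg.det(sigma)) + 3 * k

# Simulate a stable bivariate VAR(1) and score candidate orders 1..5.
rng = np.random.default_rng(0)
m, n = 2, 200
A = np.array([[0.5, 0.1], [0.0, 0.4]])
y = np.zeros((n, m))
for t in range(1, n):
    y[t] = A @ y[t - 1] + rng.standard_normal(m)

scores = {p: kic_var(y, p) for p in range(1, 6)}
best = min(scores, key=scores.get)
print(best)
```

In small samples the point of the bias correction is precisely that the uncorrected penalty above tends to overfit; KICvc replaces the 3k term with a corrected penalty derived in the paper.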
Pages: 97-100
Page count: 4