A small sample model selection criterion based on Kullback's symmetric divergence

Authors
Seghouane, AK [1 ]
Bekara, M [1 ]
Fleury, G [1 ]
Affiliations
[1] SUPELEC, Serv Mesures, F-91192 Gif Sur Yvette, France
Source
2003 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, VOL VI, PROCEEDINGS: SIGNAL PROCESSING THEORY AND METHODS | 2003
DOI
Not available
CLC Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The Kullback information criterion (KIC) is a recently developed tool for statistical model selection [1]. KIC serves as an asymptotically unbiased estimator of a variant of the Kullback symmetric divergence, also known as the J-divergence. In this paper, a bias correction of the Kullback symmetric information criterion is derived for linear models. The correction is of particular use when the sample size is small or when the number of fitted parameters is a moderate to large fraction of the sample size. For linear regression models, the corrected method, called KICc, is an exactly unbiased estimator of a variant of the Kullback symmetric divergence between the true unknown model and the candidate fitted model. Furthermore, in an application to autoregressive time series models, KICc is found to provide better model order choices than other asymptotically efficient methods.
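The uncorrected criterion being refined here can be sketched numerically. The following is a minimal illustration (not the paper's KICc derivation) that scores Gaussian linear regression models with the asymptotic penalty form KIC = -2 log L̂ + 3(k+1) from [1], where k regression coefficients plus the noise variance are counted as fitted parameters; the function name, the polynomial-degree example, and the data-generating setup are illustrative assumptions, not from the paper.

```python
import numpy as np

def kic(y, X):
    """Score a Gaussian linear model y = X b + e with the asymptotic KIC.

    Uses the penalty form KIC = -2 log L_hat + 3 (k + 1) from [1], where
    k is the number of regression coefficients and +1 counts the ML noise
    variance. This is the criterion whose small-sample bias the paper's
    KICc correction removes; the uncorrected form is sketched here.
    """
    n, k = X.shape
    b, *_ = np.linalg.lstsq(X, y, rcond=None)   # least-squares = ML fit
    resid = y - X @ b
    sigma2 = resid @ resid / n                   # ML variance estimate
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1.0)
    return -2.0 * loglik + 3.0 * (k + 1)

# Illustrative model-order selection: data from a quadratic, score degrees 0..5.
rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 40)
y = 1.0 - 2.0 * x + 3.0 * x**2 + 0.1 * rng.standard_normal(x.size)
scores = {d: kic(y, np.vander(x, d + 1)) for d in range(6)}
best = min(scores, key=scores.get)
```

In the small-sample regime the paper targets, KICc would replace the fixed 3(k+1) penalty with an exactly unbiased term that grows as k approaches the sample size n, penalizing over-parameterized candidates more strongly than this asymptotic sketch does.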
Pages: 145-148
Page count: 4