A Kullback's symmetric divergence criterion with application to linear regression and time series model

Times Cited: 0
Authors
Belkacemi, Hocine [1]
Seghouane, Abed-Krim [1]
Affiliations
[1] CNRS, Supelec, Lab Signaux & Syst, F-91192 Gif Sur Yvette, France
Source
2005 IEEE/SP 13th Workshop on Statistical Signal Processing (SSP), Vols 1 and 2 | 2005
Keywords
DOI
Not available
Chinese Library Classification (CLC)
TP39 [Computer applications];
Discipline codes
081203; 0835;
Abstract
The Kullback information criterion (KIC) is a recently developed tool for statistical model selection. KIC serves as an asymptotically unbiased estimator of the Kullback symmetric divergence, known as the J-divergence. A corrected version of KIC, denoted KICC, has also been proposed to correct the bias of KIC; however, KICC tends to overfit as the sample size increases. In this paper we propose an alternative to KICC, the KICU criterion, which is an unbiased estimator of the Kullback symmetric divergence. It provides better model choices than KICC for moderate to large sample sizes.
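As an illustration, the large-sample form of KIC given by Cavanaugh (1999), reference [2] below, is KIC = -2 log L + 3k, where L is the maximized likelihood and k the number of estimated parameters. The following minimal Python sketch, assuming a Gaussian linear regression setting, shows how such a criterion is applied to select a model order; the helper names and the polynomial example are illustrative only, and the KICC and KICU corrections discussed in the abstract are not implemented here.

import numpy as np

def gaussian_log_likelihood(y, X):
    # Maximized Gaussian log-likelihood of the linear regression y = X b + e.
    n = len(y)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / n          # ML estimate of the noise variance
    return -0.5 * n * (np.log(2.0 * np.pi * sigma2) + 1.0)

def kic(y, X):
    # Large-sample criterion KIC = -2 log L + 3 k (Cavanaugh, 1999);
    # k counts the regression coefficients plus the noise variance.
    k = X.shape[1] + 1
    return -2.0 * gaussian_log_likelihood(y, X) + 3.0 * k

# Illustrative use: choose a polynomial order by minimizing KIC.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 60)
y = 1.0 + 2.0 * t - 3.0 * t ** 2 + 0.1 * rng.standard_normal(t.size)
scores = {p: kic(y, np.vander(t, p + 1, increasing=True)) for p in range(1, 6)}
best_order = min(scores, key=scores.get)    # order with the smallest KIC

The candidate minimizing the score is retained; the KICC and KICU criteria discussed in the abstract refine this large-sample penalty for finite samples.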
Pages: 508-511
Number of pages: 4
References
8 records
[1] Akaike, H. A new look at the statistical model identification. IEEE Transactions on Automatic Control, 1974, AC-19(6): 716-723.
[2] Cavanaugh, J.E. A large-sample model selection criterion based on Kullback's symmetric divergence. Statistics & Probability Letters, 1999, 42(4): 333-343.
[3] Hurvich, C.M., Tsai, C.-L. Regression and time series model selection in small samples. Biometrika, 1989, 76(2): 297-307. DOI 10.2307/2336663.
[4] Rissanen, J. Modeling by shortest data description. Automatica, 1978, 14(5): 465-471.
[5] Schwarz, G. Estimating the dimension of a model. Annals of Statistics, 1978, 6(2): 461-464.
[6] Seghouane, A.-K., Bekara, M. A small sample model selection criterion based on Kullback's symmetric divergence. IEEE Transactions on Signal Processing, 2004, 52(12): 3314-3323.
[7] Shumway, R. Statistics & Probability Letters, 1997, 34: 285.
[8] Sugiura, N. Further analysis of data by Akaike's information criterion and finite corrections. Communications in Statistics, Part A - Theory and Methods, 1978, 7(1): 13-26.