A criterion for model selection in the presence of incomplete data based on Kullback's symmetric divergence

Cited by: 19
Authors
Seghouane, AK [1 ]
Bekara, M [1 ]
Fleury, G [1 ]
Affiliation
[1] École Supérieure d'Électricité (Supélec), Service des Mesures, F-91192 Gif-sur-Yvette, France
Keywords
model selection; Kullback-Leibler information; AIC(cd) criterion; PDIO criterion; KIC criterion; EM algorithm; SEM algorithm
DOI
10.1016/j.sigpro.2005.02.004
Chinese Library Classification (CLC)
TM (Electrical engineering); TN (Electronic technology, communication technology)
Discipline classification codes
0808; 0809
Abstract
A criterion is proposed for model selection in the presence of incomplete data. Its construction is based on the motivations provided for the recently developed KIC criterion and for the PDIO (predictive divergence for incomplete observation models) criterion. The proposed criterion serves as an asymptotically unbiased estimator of the complete-data Kullback-Leibler symmetric divergence between a candidate model and the generating model. It is therefore a natural extension of KIC to settings where the observed data are incomplete, and it is equivalent to KIC when there are no missing data. The proposed criterion differs from PDIO in both its goodness-of-fit term and its complexity term, but it differs from AIC(cd) (where the notation "cd" stands for "complete data") only in its complexity term. Unlike AIC, KIC and PDIO, this criterion can be evaluated using only complete-data tools, readily available through the EM and SEM algorithms. The performance of the proposed criterion relative to other well-known criteria is examined in a simulation study. (c) 2005 Elsevier B.V. All rights reserved.
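For reference, the quantities named in the abstract can be sketched using their standard definitions; this is a minimal background summary, and the exact form of the proposed incomplete-data criterion is given in the paper itself rather than reproduced here. Kullback's symmetric divergence (J-divergence) between densities f and g, the AIC and KIC criteria it motivates, and the EM decomposition of the observed-data log-likelihood are

\[ J(f, g) = D_{\mathrm{KL}}(f \,\|\, g) + D_{\mathrm{KL}}(g \,\|\, f), \qquad D_{\mathrm{KL}}(f \,\|\, g) = \int f(y) \log \frac{f(y)}{g(y)} \, dy, \]
\[ \mathrm{AIC} = -2 \log f(y \mid \hat{\theta}) + 2k, \qquad \mathrm{KIC} = -2 \log f(y \mid \hat{\theta}) + 3k, \]
\[ \log f(y \mid \theta) = Q(\theta \mid \theta) - H(\theta \mid \theta), \quad Q(\theta' \mid \theta) = \mathrm{E}\big[\log f(x \mid \theta') \,\big|\, y, \theta\big], \quad H(\theta' \mid \theta) = \mathrm{E}\big[\log f(x \mid y, \theta') \,\big|\, y, \theta\big], \]

where \(y\) denotes the observed (incomplete) data, \(x\) the complete data, \(\hat{\theta}\) the maximum-likelihood estimate, \(k\) the number of free parameters, and \(Q\) the expected complete-data log-likelihood computed in the E-step of the EM algorithm. The abstract states that the proposed criterion is evaluated from such complete-data quantities, which the EM and SEM algorithms make readily available.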
Pages: 1405-1417
Page count: 13
Related papers
50 records in total
  • [1] A criterion for model selection in the presence of incomplete data based on Kullback's symmetric divergence
    Seghouane, AK
    Bekara, M
    Fleury, G
    BAYESIAN INFERENCE AND MAXIMUM ENTROPY METHODS IN SCIENCE AND ENGINEERING, 2004, 707 : 429 - 441
  • [2] A bootstrap model selection criterion based on Kullback's symmetric divergence
    Seghouane, AK
    De Lathauwer, L
    PROCEEDINGS OF THE 2003 IEEE WORKSHOP ON STATISTICAL SIGNAL PROCESSING, 2003, : 494 - 497
  • [3] A criterion for vector autoregressive model selection based on Kullback's symmetric divergence
    Seghouane, AK
    2005 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, VOLS 1-5: SPEECH PROCESSING, 2005, : 97 - 100
  • [4] A small sample model selection criterion based on Kullback's symmetric divergence
    Seghouane, AK
    Bekara, M
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2004, 52 (12) : 3314 - 3323
  • [5] A small sample model selection criterion based on Kullback's symmetric divergence
    Seghouane, AK
    Bekara, M
    Fleury, G
    2003 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, VOL VI, PROCEEDINGS: SIGNAL PROCESSING THEORY AND METHODS, 2003, : 145 - 148
  • [6] A large-sample model selection criterion based on Kullback's symmetric divergence
    Cavanaugh, JE
    STATISTICS & PROBABILITY LETTERS, 1999, 42 (04) : 333 - 343
  • [7] Model Selection Criterion Based on Kullback-Leibler's Symmetric Divergence for Simultaneous Equations Model
    Keerativibool, Warangkhana
    Jitthavech, Jirawan
    CHIANG MAI JOURNAL OF SCIENCE, 2015, 42 (03): : 761 - 773
  • [8] Criteria for linear model selection based on Kullback's symmetric divergence
    Cavanaugh, JE
    AUSTRALIAN & NEW ZEALAND JOURNAL OF STATISTICS, 2004, 46 (02) : 257 - 274
  • [9] A linear vector model selection criterion based on Kullback-Leibler divergence
    Seghouane, AK
    BAYESIAN INFERENCE AND MAXIMUM ENTROPY METHODS IN SCIENCE AND ENGINEERING, 2004, 735 : 573 - 580
  • [10] A Kullback's symmetric divergence criterion with application to linear regression and time series model
    Belkacemi, Hocine
    Seghouane, Abed-Krim
    2005 IEEE/SP 13TH WORKSHOP ON STATISTICAL SIGNAL PROCESSING (SSP), VOLS 1 AND 2, 2005, : 508 - 511