A criterion for model selection in the presence of incomplete data based on Kullback's symmetric divergence

Cited by: 0
Authors
Seghouane, AK [1]
Bekara, M [1]
Fleury, G [1]
Affiliation
[1] Ecole Super Elect, Serv Mesures, F-91192 Gif Sur Yvette, France
Source
BAYESIAN INFERENCE AND MAXIMUM ENTROPY METHODS IN SCIENCE AND ENGINEERING | 2004, Vol. 707
CLC number: O4 [Physics]
Discipline code: 0702
Abstract
A criterion is proposed for model selection in the presence of incomplete data. Its construction is based on the motivations underlying the recently developed KIC criterion and the PDIO criterion. The proposed criterion serves as an asymptotically unbiased estimator of the complete-data Kullback-Leibler symmetric divergence between a candidate model and the generating model. It is therefore a natural extension of KIC to settings where the observed data is incomplete, and it is equivalent to KIC when there is no missing data. The proposed criterion differs from PDIO (predictive divergence for incomplete observation models) in both its goodness-of-fit term and its complexity term. Unlike AIC, KIC, and PDIO, this criterion can be evaluated using only complete-data tools, readily available through the EM and SEM algorithms. The performance of the proposed criterion relative to those of PDIO, KIC, and AIC is examined in a simulation study.
Pages: 429-441 (13 pages)
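The abstract contrasts AIC with the heavier-penalty KIC in the complete-data case. As a hedged illustration only (this is not the paper's incomplete-data criterion, and the polynomial-regression setup and all variable names below are invented for the example), the classical complete-data forms AIC = -2 log L + 2k and KIC = -2 log L + 3k can be sketched for Gaussian linear models:

```python
import numpy as np

def gaussian_loglik(rss, n):
    # ML log-likelihood of a linear model with Gaussian errors,
    # with the error variance fixed at its ML estimate rss / n
    sigma2 = rss / n
    return -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)

def aic(loglik, k):
    return -2 * loglik + 2 * k   # Akaike's criterion

def kic(loglik, k):
    return -2 * loglik + 3 * k   # Cavanaugh's symmetric-divergence criterion

# Synthetic data from a quadratic model (assumed example, not from the paper)
rng = np.random.default_rng(0)
n = 200
x = rng.uniform(-1, 1, n)
y = 1.0 + 2.0 * x - 1.5 * x**2 + rng.normal(0.0, 0.3, n)

scores = {}
for order in range(1, 6):
    X = np.vander(x, order + 1)                  # polynomial design matrix
    beta, rss, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(rss[0]) if rss.size else float(np.sum((y - X @ beta) ** 2))
    k = order + 2                                # coefficients + error variance
    ll = gaussian_loglik(rss, n)
    scores[order] = (aic(ll, k), kic(ll, k))

best_kic = min(scores, key=lambda o: scores[o][1])
print(best_kic)
```

KIC's extra penalty of k over AIC reflects its target: the symmetric (two-sided) Kullback-Leibler divergence rather than the directed divergence AIC estimates, which makes KIC somewhat more conservative against overfitting.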
Related papers (50 total)
  • [21] Kullback-Leibler divergence-based ASR training data selection
    Gouvea, Evandro
    Davel, Marelie H.
    12TH ANNUAL CONFERENCE OF THE INTERNATIONAL SPEECH COMMUNICATION ASSOCIATION 2011 (INTERSPEECH 2011), VOLS 1-5, 2011, : 2308 - +
  • [22] HMM-based hierarchical unit selection combining Kullback-Leibler divergence with likelihood criterion
    Ling, Zhen-Hua
    Wang, Ren-Hua
    2007 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, VOL IV, PTS 1-3, 2007, : 1245 - +
  • [23] An information criterion for model selection with missing data via complete-data divergence
    Shimodaira, Hidetoshi
    Maeda, Haruyoshi
    ANNALS OF THE INSTITUTE OF STATISTICAL MATHEMATICS, 2018, 70 (02) : 421 - 438
  • [25] Point Set Registration Method Based on Symmetric Kullback-Leibler Divergence
    Yang Xiaoyan
    LASER & OPTOELECTRONICS PROGRESS, 2020, 57 (08)
  • [26] Minimising the Kullback-Leibler Divergence for Model Selection in Distributed Nonlinear Systems
    Cliff, Oliver M.
    Prokopenko, Mikhail
    Fitch, Robert
    ENTROPY, 2018, 20 (02):
  • [27] Markov-switching model selection using Kullback-Leibler divergence
    Smith, Aaron
    Naik, Prasad A.
    Tsai, Chih-Ling
    JOURNAL OF ECONOMETRICS, 2006, 134 (02) : 553 - 577
  • [28] Mutual information criterion for feature selection from incomplete data
    Qian, Wenbin
    Shu, Wenhao
    NEUROCOMPUTING, 2015, 168 : 210 - 220
  • [29] An Information Criterion for Auxiliary Variable Selection in Incomplete Data Analysis
    Imori, Shinpei
    Shimodaira, Hidetoshi
    ENTROPY, 2019, 21 (03):
  • [30] Biological Data Outlier Detection Based on Kullback-Leibler Divergence
    Oh, Jung Hun
    Gao, Jean
    Rosenblatt, Kevin
    2008 IEEE INTERNATIONAL CONFERENCE ON BIOINFORMATICS AND BIOMEDICINE, PROCEEDINGS, 2008, : 249 - +