A criterion for model selection in the presence of incomplete data based on Kullback's symmetric divergence

Cited: 0
Authors
Seghouane, AK [1 ]
Bekara, M [1 ]
Fleury, G [1 ]
Affiliations
[1] Ecole Super Elect, Serv Mesures, F-91192 Gif Sur Yvette, France
Source
BAYESIAN INFERENCE AND MAXIMUM ENTROPY METHODS IN SCIENCE AND ENGINEERING | 2004 / Vol. 707
Keywords
DOI
Not available
CLC number
O4 [Physics]
Subject classification code
0702
Abstract
A criterion is proposed for model selection in the presence of incomplete data. Its construction is based on the motivations behind the recently developed KIC criterion and the PDIO criterion. The proposed criterion serves as an asymptotically unbiased estimator of the complete-data Kullback-Leibler symmetric divergence between a candidate model and the generating model. It is therefore a natural extension of KIC to settings where the observed data are incomplete, and it is equivalent to KIC when there are no missing data. The proposed criterion differs from PDIO (predictive divergence for incomplete observation models) in both its goodness-of-fit term and its complexity term. Unlike AIC, KIC, and PDIO, this criterion can be evaluated using only complete-data tools, readily available through the EM and SEM algorithms. The performance of the proposed criterion relative to PDIO, KIC, and AIC is examined in a simulation study.
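As background to the abstract: Kullback's symmetric divergence (also called the J-divergence) between densities f and g is the sum of the two directed Kullback-Leibler divergences, J(f, g) = KL(f ‖ g) + KL(g ‖ f). The paper's criterion estimates this quantity between a candidate model and the generating model; the snippet below is only an illustrative sketch of the J-divergence itself for univariate Gaussians (the closed form for the Gaussian case is standard, not taken from this paper), with hypothetical function names.

```python
import math

def kl_gauss(mu1, var1, mu2, var2):
    # Directed KL divergence KL(N(mu1, var1) || N(mu2, var2))
    # for univariate Gaussians (standard closed form).
    return 0.5 * (math.log(var2 / var1)
                  + (var1 + (mu1 - mu2) ** 2) / var2
                  - 1.0)

def j_divergence(mu1, var1, mu2, var2):
    # Kullback's symmetric (J-) divergence: sum of the two
    # directed KL divergences, so J(f, g) == J(g, f).
    return (kl_gauss(mu1, var1, mu2, var2)
            + kl_gauss(mu2, var2, mu1, var1))
```

Because J sums both directions, it is symmetric in its arguments and vanishes exactly when the two distributions coincide, which is what makes it a natural discrepancy measure for comparing a fitted candidate model against the data-generating model.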
Pages: 429 - 441
Page count: 13
Related papers
50 in total
  • [41] Abnormality detection based on the Kullback-Leibler divergence for generalized Gaussian data
    Xiong, Ying
    Jing, Yindi
    Chen, Tongwen
    CONTROL ENGINEERING PRACTICE, 2019, 85 : 257 - 270
  • [42] Unsupervised Weight Parameter Estimation for Exponential Mixture Distribution based on Symmetric Kullback-Leibler Divergence
    Uchida, Masato
    2014 JOINT 7TH INTERNATIONAL CONFERENCE ON SOFT COMPUTING AND INTELLIGENT SYSTEMS (SCIS) AND 15TH INTERNATIONAL SYMPOSIUM ON ADVANCED INTELLIGENT SYSTEMS (ISIS), 2014, : 1126 - 1129
  • [43] Unsupervised Weight Parameter Estimation for Exponential Mixture Distribution Based on Symmetric Kullback-Leibler Divergence
    Uchida, Masato
    IEICE TRANSACTIONS ON FUNDAMENTALS OF ELECTRONICS COMMUNICATIONS AND COMPUTER SCIENCES, 2015, E98A (11): : 2349 - 2353
  • [44] Criterion for selection of model and controller design based on I/O data
    Tsumura, K
    Kimura, H
    PROCEEDINGS OF THE 39TH IEEE CONFERENCE ON DECISION AND CONTROL, VOLS 1-5, 2000, : 2837 - 2842
  • [45] Formal and informal model selection with incomplete data
    Verbeke, Geert
    Molenberghs, Geert
    Beunckens, Caroline
    STATISTICAL SCIENCE, 2008, 23 (02) : 201 - 218
  • [46] Training Data Selection for Acoustic Modeling via Submodular Optimization of Joint Kullback-Leibler Divergence
    Asami, Taichi
    Masumura, Ryo
    Masataki, Hirokazu
    Okamoto, Manabu
    Sakauchi, Sumitaka
    16TH ANNUAL CONFERENCE OF THE INTERNATIONAL SPEECH COMMUNICATION ASSOCIATION (INTERSPEECH 2015), VOLS 1-5, 2015, : 3645 - 3649
  • [47] Optimal Viewpoint Selection Based on Aesthetic Composition Evaluation Using Kullback-Leibler Divergence
    Lan, Kai
    Sekiyama, Kosuke
    INTELLIGENT ROBOTICS AND APPLICATIONS, ICIRA 2016, PT I, 2016, 9834 : 433 - 443
  • [48] A model selection criterion for model-based clustering of annotated gene expression data
    Gallopin, Melina
    Celeux, Gilles
    Jaffrezic, Florence
    Rau, Andrea
    STATISTICAL APPLICATIONS IN GENETICS AND MOLECULAR BIOLOGY, 2015, 14 (05) : 413 - 428
  • [49] A similarity measure based on Kullback-Leibler divergence for collaborative filtering in sparse data
    Deng, Jiangzhou
    Wang, Yong
    Guo, Junpeng
    Deng, Yongheng
    Gao, Jerry
    Park, Younghee
    JOURNAL OF INFORMATION SCIENCE, 2019, 45 (05) : 656 - 675
  • [50] Comparison of the Model Selection Criteria for Multiple Regression Based on Kullback-Leibler's Information
    Keerativibool, Warangkhana
    Siripanich, Pachitjanut
    CHIANG MAI JOURNAL OF SCIENCE, 2017, 44 (02): : 699 - 714