A model selection approach to signal denoising using Kullback's symmetric divergence

Cited by: 5
Authors
Bekara, Maiza
Knockaert, Luc
Seghouane, Abd-Krim
Fleury, Gilles
Affiliations
[1] Ecole Superieure d'Electricite (Supelec), Service des Mesures, F-91192 Gif-sur-Yvette, France
[2] Ghent University, Dept. of Information Technology (INTEC), IMEC, B-9000 Ghent, Belgium
Keywords
signal denoising; model selection; information criterion
DOI
10.1016/j.sigpro.2005.03.023
Chinese Library Classification (CLC)
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology]
Subject Classification
0808; 0809
Abstract
We consider the determination of a soft/hard coefficient threshold for recovering a signal embedded in additive Gaussian noise, a problem closely related to variable selection in linear regression. Viewing denoising as a model selection problem, we propose a new information-theoretic model selection approach to signal denoising. We first construct a statistical model for the unknown signal and then seek the best approximating model (corresponding to the denoised signal) within a set of candidates. We adopt Kullback's symmetric divergence as the measure of similarity between the unknown model and a candidate model; the best approximating model is the one that minimizes an unbiased estimator of this divergence. The advantage of a denoising method based on model selection over classical thresholding approaches is that the threshold is determined automatically, without the need to estimate the noise variance. The proposed method, called KICc denoising (corrected Kullback Information Criterion), is compared with cross-validation (CV), minimum description length (MDL), and the classical SureShrink and VisuShrink methods in a simulation study on three types of signals: chirp, seismic, and piecewise polynomial. (C) 2005 Elsevier B.V. All rights reserved.
Pages: 1400 - 1409
Number of pages: 10
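As a rough illustration of the model selection view described in the abstract, the sketch below retains the k largest wavelet coefficients and chooses k by minimizing a penalized criterion computed from the residual energy alone, so no separate noise-variance estimate is needed. The wavelet choice (PyWavelets 'db4' with periodization) and the stand-in penalty (a Saito-style MDL term) are illustrative assumptions, not the paper's method: the paper minimizes its KICc criterion instead, whose formula is not reproduced here.

# Minimal sketch, assuming PyWavelets and an MDL-style stand-in penalty in place of KICc.
import numpy as np
import pywt

def denoise_by_model_selection(y, wavelet='db4'):
    # Keep the k largest wavelet coefficients, with k chosen by minimizing a
    # penalized criterion; no explicit noise-variance estimate is required.
    n = len(y)
    coeffs = pywt.wavedec(y, wavelet, mode='periodization')
    arr, slices = pywt.coeffs_to_array(coeffs)       # flatten the coefficient tree
    order = np.argsort(np.abs(arr))[::-1]            # indices by decreasing magnitude
    energy = np.cumsum(arr[order] ** 2)              # energy captured by the k largest coefficients
    total = energy[-1]

    best_k, best_score = 1, np.inf
    for k in range(1, n // 2):                       # candidate model sizes
        rss = max(total - energy[k - 1], 1e-12)      # residual energy of the size-k model
        # Stand-in penalty (Saito-style MDL); the paper minimizes KICc here instead.
        score = 1.5 * k * np.log(n) + 0.5 * n * np.log(rss / n)
        if score < best_score:
            best_k, best_score = k, score

    kept = np.zeros_like(arr)
    idx = order[:best_k]
    kept[idx] = arr[idx]                             # hard threshold: keep only the k largest
    coeffs_k = pywt.array_to_coeffs(kept, slices, output_format='wavedec')
    return pywt.waverec(coeffs_k, wavelet, mode='periodization')[:n]

Because the wavelet transform is orthogonal, the residual energy of the size-k model in the coefficient domain equals the signal-domain residual sum of squares, which is all the criterion needs; calling denoise_by_model_selection on a noisy chirp, for example, returns the denoised signal without any sigma estimate.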
Related Papers
50 records in total
  • [21] Pseudo-Online Classification of Mental Tasks Using Kullback-Leibler Symmetric Divergence
    Benevides, Alessandro B.
    Bastos Filho, Teodiano F.
    Sarcinelli Filho, Mario
    JOURNAL OF MEDICAL AND BIOLOGICAL ENGINEERING, 2012, 32 (06) : 411 - 416
  • [22] Speech enhancement using a wavelet thresholding method based on symmetric Kullback-Leibler divergence
    Tabibian, Shima
    Akbari, Ahmad
    Nasersharif, Babak
    SIGNAL PROCESSING, 2015, 106 : 184 - 197
  • [23] HYPERSPECTRAL BAND SELECTION USING KULLBACK-LEIBLER DIVERGENCE FOR BLUEBERRY FRUIT DETECTION
    Yang, Ce
    Lee, Won Suk
    Gader, Paul
    Li, Han
    2013 5TH WORKSHOP ON HYPERSPECTRAL IMAGE AND SIGNAL PROCESSING: EVOLUTION IN REMOTE SENSING (WHISPERS), 2013,
  • [24] Feature Selection Algorithm for Hierarchical Text Classification Using Kullback-Leibler Divergence
    Yao Lifang
    Qin Sijun
    Zhu Huan
    2017 2ND IEEE INTERNATIONAL CONFERENCE ON CLOUD COMPUTING AND BIG DATA ANALYSIS (ICCCBDA 2017), 2017, : 421 - 424
  • [25] Dynamic fine-tuning layer selection using Kullback-Leibler divergence
    Wanjiku, Raphael Ngigi
    Nderu, Lawrence
    Kimwele, Michael
    ENGINEERING REPORTS, 2023, 5 (05)
  • [26] On Bayesian selection of the best normal population using the Kullback-Leibler divergence measure
    Thabane, L
    Haq, MS
    STATISTICA NEERLANDICA, 1999, 53 (03) : 342 - 360
  • [27] Extraction method for signal effective component based on extreme-point symmetric mode decomposition and Kullback-Leibler divergence
    Zhu, Yong
    Tang, Shengnan
    Quan, Lingxiao
    Jiang, Wanlu
    Zhou, Ling
    JOURNAL OF THE BRAZILIAN SOCIETY OF MECHANICAL SCIENCES AND ENGINEERING, 2019, 41
  • [28] Speech Denoising Using Non-negative Matrix Factorization with Kullback-Leibler Divergence and Sparseness Constraints
    Ludena-Choez, Jimmy
    Gallardo-Antolin, Ascension
    ADVANCES IN SPEECH AND LANGUAGE TECHNOLOGIES FOR IBERIAN LANGUAGES, 2012, 328 : 207 - 216
  • [29] Predictability of Ensemble Forecasting Estimated Using the Kullback-Leibler Divergence in the Lorenz Model
    Ding, Ruiqiang
    Liu, Baojia
    Gu, Bin
    Li, Jianping
    Li, Xuan
    ADVANCES IN ATMOSPHERIC SCIENCES, 2019, 36 (08) : 837 - 846
  • [30] Optimal Viewpoint Selection Based on Aesthetic Composition Evaluation Using Kullback-Leibler Divergence
    Lan, Kai
    Sekiyama, Kosuke
    INTELLIGENT ROBOTICS AND APPLICATIONS, ICIRA 2016, PT I, 2016, 9834 : 433 - 443