Euclid in a Taxicab: Sparse Blind Deconvolution with Smoothed l1/l2 Regularization

Cited by: 78
Authors:
Repetti, Audrey [1 ]
Mai Quyen Pham [1 ,2 ]
Duval, Laurent [2 ]
Chouzenoux, Emilie [1 ]
Pesquet, Jean-Christophe [1 ]
Affiliations:
[1] Univ Paris Est, LIGM UMR CNRS 8049, F-77454 Champs Sur Marne, France
[2] IFP Energies Nouvelles, F-92500 Rueil Malmaison, France
Keywords:
Blind deconvolution; nonconvex optimization; norm ratio; preconditioned forward-backward algorithm; seismic data processing; sparsity; smoothed l1/l2 regularization; COORDINATE DESCENT METHOD; NONNEGATIVE MATRIX; FACTORIZATION; CONVERGENCE; SIGNALS;
DOI: 10.1109/LSP.2014.2362861
Chinese Library Classification: TM (Electrical Engineering); TN (Electronics & Communication Technology);
Discipline codes: 0808; 0809;
Abstract:
The l1/l2 ratio regularization function has shown good performance for retrieving sparse signals in a number of recent works on blind deconvolution. Indeed, it benefits from a scale-invariance property that is highly desirable in the blind context. However, the l1/l2 function raises some difficulties when solving the nonconvex and nonsmooth minimization problems resulting from the use of such a penalty term in current restoration methods. In this paper, we propose a new penalty based on a smooth approximation to the l1/l2 function. In addition, we develop a proximal-based algorithm to solve variational problems involving this function, and we derive theoretical convergence results. We demonstrate the effectiveness of our method through a comparison with a recent alternating optimization strategy dealing with the exact l1/l2 term, on an application to seismic data blind deconvolution.
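The smoothing idea described in the abstract can be illustrated with a short sketch. The snippet below is an assumption-laden illustration, not the paper's exact formulation: it replaces the l1 norm with a standard smooth surrogate sum_i (sqrt(x_i^2 + alpha^2) - alpha), the l2 norm with sqrt(||x||^2 + eta^2), and combines them in a log-of-ratio penalty. The function names and the smoothing parameters `alpha`, `beta`, `eta` are hypothetical choices for this sketch.

```python
import numpy as np

def smoothed_l1(x, alpha=1e-3):
    # Smooth surrogate for the l1 norm: sum_i (sqrt(x_i^2 + alpha^2) - alpha).
    # Tends to ||x||_1 as alpha -> 0, but is differentiable everywhere.
    return np.sum(np.sqrt(x**2 + alpha**2) - alpha)

def smoothed_l2(x, eta=1e-3):
    # Smooth surrogate for the l2 norm: sqrt(||x||_2^2 + eta^2).
    # Strictly positive, so the ratio below is well defined even at x = 0.
    return np.sqrt(np.sum(x**2) + eta**2)

def smoothed_l1_over_l2(x, alpha=1e-3, beta=1e-3, eta=1e-3):
    # Hypothetical smooth ratio penalty: log((l1_alpha(x) + beta) / l2_eta(x)).
    # For small alpha, beta, eta it approximates log(||x||_1 / ||x||_2),
    # which is small for sparse x and larger for dense x of the same energy.
    return np.log((smoothed_l1(x, alpha) + beta) / smoothed_l2(x, eta))
```

For example, a 1-sparse vector and a dense vector with the same l2 norm are ranked as expected: the penalty is lower for the sparse one, which is the behavior that makes the l1/l2 ratio attractive as a sparsity measure.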
Pages: 539 - 543 (5 pages)
Related Papers (50 records)
  • [11] SPARSE REPRESENTATION LEARNING OF DATA BY AUTOENCODERS WITH L1/2 REGULARIZATION
    Li, F.
    Zurada, J. M.
    Wu, W.
    NEURAL NETWORK WORLD, 2018, 28 (02) : 133 - 147
  • [12] SPARSE BLIND DECONVOLUTION BASED ON SCALE INVARIANT SMOOTHED l0-NORM
    Nose-Filho, Kenji
    Jutten, Christian
    Romano, Joao M. T.
    2014 PROCEEDINGS OF THE 22ND EUROPEAN SIGNAL PROCESSING CONFERENCE (EUSIPCO), 2014, : 461 - 465
  • [13] Sparse smooth group L0∘L1/2 regularization method for convolutional neural networks
    Quasdane, Mohamed
    Ramchoun, Hassan
    Masrour, Tawfik
    KNOWLEDGE-BASED SYSTEMS, 2024, 284
  • [14] A HIERARCHICAL SPARSITY-SMOOTHNESS BAYESIAN MODEL FOR l0 + l1 + l2 REGULARIZATION
    Chaari, Lotfi
    Batatia, Hadj
    Dobigeon, Nicolas
    Tourneret, Jean-Yves
    2014 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2014,
  • [15] Improved sparse reconstruction for fluorescence molecular tomography with L1/2 regularization
    Guo, Hongbo
    Yu, Jingjing
    He, Xiaowei
    Hou, Yuqing
    Dong, Fang
    Zhang, Shuling
    BIOMEDICAL OPTICS EXPRESS, 2015, 6 (05) : 1648 - 1664
  • [16] A necessary and sufficient condition for sparse vector recovery via l1 - l2 minimization
    Bi, Ning
    Tang, Wai-Shing
    APPLIED AND COMPUTATIONAL HARMONIC ANALYSIS, 2022, 56 : 337 - 350
  • [17] k Block Sparse Vector Recovery via Block l1 - l2 Minimization
    Xie, Shaohua
    Liang, Kaihao
    CIRCUITS SYSTEMS AND SIGNAL PROCESSING, 2023, 42 (05) : 2897 - 2915
  • [18] Accelerated schemes for the L1/L2 minimization
    Wang C.
    Yan M.
    Rahimi Y.
    Lou Y.
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2020, 68 : 2660 - 2669
  • [19] A new sufficient condition for sparse vector recovery via l1 - l2 local minimization
    Bi, Ning
    Tan, Jun
    Tang, Wai-Shing
    ANALYSIS AND APPLICATIONS, 2021, 19 (06) : 1019 - 1031
  • [20] A new Sigma-Pi-Sigma neural network based on L1 and L2 regularization and applications
    Jiao, Jianwei
    Su, Keqin
    AIMS MATHEMATICS, 2024, 9 (03): : 5995 - 6012