Euclid in a Taxicab: Sparse Blind Deconvolution with Smoothed ℓ1/ℓ2 Regularization

Citations: 78
Authors
Repetti, Audrey [1 ]
Mai Quyen Pham [1 ,2 ]
Duval, Laurent [2 ]
Chouzenoux, Emilie [1 ]
Pesquet, Jean-Christophe [1 ]
Affiliations
[1] Univ Paris Est, LIGM UMR CNRS 8049, F-77454 Champs Sur Marne, France
[2] IFP Energies Nouvelles, F-92500 Rueil Malmaison, France
Keywords
Blind deconvolution; nonconvex optimization; norm ratio; preconditioned forward-backward algorithm; seismic data processing; sparsity; smoothed ℓ1/ℓ2 regularization; COORDINATE DESCENT METHOD; NONNEGATIVE MATRIX; FACTORIZATION; CONVERGENCE; SIGNALS;
DOI
10.1109/LSP.2014.2362861
Chinese Library Classification (CLC)
TM [Electrical technology]; TN [Electronic and communication technology];
Discipline codes
0808 ; 0809 ;
Abstract
The ℓ1/ℓ2 ratio regularization function has shown good performance for retrieving sparse signals in a number of recent works on blind deconvolution. Indeed, it benefits from a scale-invariance property that is highly desirable in the blind context. However, the ℓ1/ℓ2 function raises difficulties when solving the nonconvex and nonsmooth minimization problems that result from using such a penalty term in current restoration methods. In this paper, we propose a new penalty based on a smooth approximation to the ℓ1/ℓ2 function. In addition, we develop a proximal-based algorithm to solve variational problems involving this function, and we derive theoretical convergence results. We demonstrate the effectiveness of our method through a comparison with a recent alternating optimization strategy dealing with the exact ℓ1/ℓ2 term, on an application to seismic data blind deconvolution.
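The idea of smoothing the ℓ1/ℓ2 ratio can be illustrated with a minimal sketch. The exact smoothing terms below (the parameters alpha, beta, eta and the log-of-ratio form) are assumptions made for illustration, not a verbatim reproduction of the paper's penalty; they only show how replacing ℓ1 and ℓ2 with smooth surrogates yields a differentiable, sparsity-promoting ratio.

```python
import numpy as np

def smoothed_l1(x, alpha=1e-3):
    # Smooth surrogate of the l1 norm: sum_i (sqrt(x_i^2 + alpha^2) - alpha).
    # Differentiable everywhere; tends to ||x||_1 as alpha -> 0.
    return np.sum(np.sqrt(x ** 2 + alpha ** 2) - alpha)

def smoothed_l2(x, eta=1e-3):
    # Smooth, strictly positive surrogate of the l2 norm: sqrt(||x||^2 + eta^2).
    # Bounded away from zero, so the ratio below is always well defined.
    return np.sqrt(np.sum(x ** 2) + eta ** 2)

def smoothed_ratio_penalty(x, alpha=1e-3, beta=1e-3, eta=1e-3):
    # Illustrative smoothed sparsity penalty based on the l1/l2 ratio
    # (hypothetical form: log((l1_alpha(x) + beta) / l2_eta(x))).
    # Near scale-invariant: rescaling x barely changes the value when the
    # smoothing parameters are small relative to the signal amplitude.
    return np.log((smoothed_l1(x, alpha) + beta) / smoothed_l2(x, eta))

# A sparse signal should be penalized less than a dense signal of equal energy.
sparse = np.zeros(100)
sparse[::25] = 1.0                      # 4 spikes, ||sparse||_2 = 2
dense = np.full(100, 0.2)               # uniform, ||dense||_2 = 2
assert smoothed_ratio_penalty(sparse) < smoothed_ratio_penalty(dense)
```

Note the role of the ℓ2 smoothing: the plain ratio ℓ1(x)/ℓ2(x) is undefined at x = 0 and nonsmooth wherever entries vanish, whereas the surrogate above is differentiable everywhere, which is what makes proximal/forward-backward schemes with convergence guarantees applicable.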
Pages: 539-543 (5 pages)