MINIMIZATION OF TRANSFORMED L1 PENALTY: CLOSED FORM REPRESENTATION AND ITERATIVE THRESHOLDING ALGORITHMS

Cited: 46
Authors
Zhang, Shuai [1]
Xin, Jack [1]
Affiliations
[1] Univ Calif Irvine, Dept Math, Irvine, CA 92697 USA
Funding
National Science Foundation (NSF), USA
Keywords
Transformed l(1) penalty; closed form thresholding functions; iterative thresholding algorithms; compressed sensing; robust recovery; VARIABLE SELECTION; REGULARIZATION; RECOVERY; SPARSITY;
DOI
10.4310/CMS.2017.v15.n2.a9
Chinese Library Classification
O29 [Applied Mathematics]
Chinese Discipline Code
070104
Abstract
The transformed l(1) penalty (TL1) functions are a one-parameter family of bilinear transformations composed with the absolute value function. Acting on vectors, the TL1 penalty interpolates between l(0) and l(1), much as the l(p) quasi-norms do for p in (0,1). In our companion paper, we showed that TL1 is a robust sparsity-promoting penalty in compressed sensing (CS) problems for a broad range of incoherent and coherent sensing matrices. Here we develop an explicit fixed-point representation for the TL1-regularized minimization problem. The TL1 thresholding functions are in closed form for all parameter values; in contrast, the l(p) thresholding functions (p in [0,1]) are in closed form only for p = 0, 1, 1/2, 2/3, known as hard, soft, half, and 2/3 thresholding, respectively. The TL1 threshold values differ between the subcritical and supercritical parameter regimes, where the TL1 thresholding functions are continuous and discontinuous, respectively, resembling the soft- and half-thresholding functions. We propose TL1 iterative thresholding algorithms and compare them with hard and half thresholding algorithms on CS test problems. For both incoherent and coherent sensing matrices, the proposed TL1 iterative thresholding algorithm with adaptive subcritical and supercritical thresholds (TL1IT-s1 for short) consistently performs best in sparse signal recovery, with and without measurement noise.
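The interpolation property described in the abstract can be illustrated numerically. The sketch below uses the standard TL1 definition rho_a(t) = (a+1)|t| / (a + |t|), applied elementwise and summed over a vector; the function name and the particular test vector are illustrative choices, not taken from the paper.

```python
import numpy as np

def tl1_penalty(x, a):
    """Transformed l1 penalty: sum of (a+1)|x_i| / (a + |x_i|).

    As a -> infinity this tends to the l1 norm ||x||_1;
    as a -> 0+ it tends to the l0 count of nonzero entries.
    """
    x = np.abs(np.asarray(x, dtype=float))
    return np.sum((a + 1.0) * x / (a + x))

x = np.array([0.0, 0.5, -2.0])  # two nonzeros, ||x||_1 = 2.5

# Large parameter a: TL1 is close to the l1 norm (2.5).
print(tl1_penalty(x, 1e6))   # approximately 2.5

# Small parameter a: TL1 is close to the l0 count (2).
print(tl1_penalty(x, 1e-6))  # approximately 2.0
```

The single parameter a thus tunes the penalty continuously between the convex l1 relaxation and the combinatorial l0 objective, which is the interpolation role the l(p) quasi-norms play for p in (0,1).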
Pages: 511 - 537 (27 pages)
Related papers (50 total)
  • [21] A SIMPLER APPROACH TO WEIGHTED l1 MINIMIZATION
    Krishnaswamy, Anilesh K.
    Oymak, Samet
    Hassibi, Babak
    2012 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2012, : 3621 - 3624
  • [22] Sensitivity of l1 minimization to parameter choice
    Berk, Aaron
    Plan, Yaniv
    Yilmaz, Ozgur
    INFORMATION AND INFERENCE-A JOURNAL OF THE IMA, 2021, 10 (02) : 397 - 453
  • [23] Selective l1 Minimization for Sparse Recovery
    Van Luong Le
    Lauer, Fabien
    Bloch, Gerard
    IEEE TRANSACTIONS ON AUTOMATIC CONTROL, 2014, 59 (11) : 3008 - 3013
  • [24] Enhancing Sparsity by Reweighted l1 Minimization
    Candes, Emmanuel J.
    Wakin, Michael B.
    Boyd, Stephen P.
    JOURNAL OF FOURIER ANALYSIS AND APPLICATIONS, 2008, 14 (5-6) : 877 - 905
  • [25] SUPPORT DRIVEN REWEIGHTED l1 MINIMIZATION
    Mansour, Hassan
    Yilmaz, Oezguer
    2012 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2012, : 3309 - 3312
  • [26] Breaking through the Thresholds: an Analysis for Iterative Reweighted l1 Minimization via the Grassmann Angle Framework
    Xu, Weiyu
    Khajehnejad, M. Amin
    Avestimehr, A. Salman
    Hassibi, Babak
    2010 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, 2010, : 5498 - 5501
  • [27] Accelerated schemes for the L1/L2 minimization
    Wang C.
    Yan M.
    Rahimi Y.
    Lou Y.
    IEEE Transactions on Signal Processing, 2020, 68 : 2660 - 2669
  • [28] Tangent vector-based gradient method with l1/2-regularization: Iterative half thresholding algorithm for CS-MRI
    Qureshi, M.
    Inam, O.
    Qazi, S. A.
    Aslam, I.
    Omer, H.
    JOURNAL OF MAGNETIC RESONANCE, 2021, 333
  • [29] Affine Phase Retrieval for Sparse Signals via L1 Minimization
    Huang, Meng
    Sun, Shixiang
    Xu, Zhiqiang
    JOURNAL OF FOURIER ANALYSIS AND APPLICATIONS, 2023, 29 (03)
  • [30] A Constrained l1 Minimization Approach to Sparse Precision Matrix Estimation
    Cai, Tony
    Liu, Weidong
    Luo, Xi
    JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, 2011, 106 (494) : 594 - 607