On Partial Smoothness, Activity Identification and Faster Algorithms of L1 Over L2 Minimization

Times Cited: 0
Authors
Tao, Min [1 ]
Zhang, Xiao-Ping [2 ]
Xia, Zi-Hao [3 ]
Affiliations
[1] Nanjing Univ, Dept Math, Natl Key Lab Novel Software Technol, Nanjing 210093, Peoples R China
[2] Tsinghua Univ, Shenzhen Int Grad Sch, Shenzhen Key Lab Ubiquitous Data Enabling, Shenzhen 518055, Peoples R China
[3] Nanjing Univ, Dept Math, Nanjing 210093, Peoples R China
Keywords
Manifolds; Minimization; Signal processing algorithms; Convergence; Newton method; Vectors; Linear programming; Sparse recovery; partly smooth; prox-regularity; active set; nonsmooth analysis; identifiable surface; SPARSE REPRESENTATION; L2 NORMS; OPTIMIZATION; REGULARIZATION; CONSTRAINTS; SIGNAL; RATIO; L1;
DOI
10.1109/TSP.2024.3404250
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Discipline Code
0808; 0809;
Abstract
The L1/L2 norm ratio arose as a sparseness measure and has attracted considerable attention due to three merits: (i) it gives a sharper approximation of L0 than L1; (ii) it is parameter-free and scale-invariant; (iii) it is more attractive than L1 under highly coherent matrices. In this paper, we first establish the partly smooth property of L1 over L2 minimization relative to an active manifold M and also demonstrate its prox-regularity property. Second, we reveal that ADMMp (or ADMM+p) can identify the active manifold within a finite number of iterations. This discovery contributes to a deeper understanding of the optimization landscape associated with L1 over L2 minimization. Third, we propose a novel heuristic algorithmic framework that combines ADMMp (or ADMM+p) with a globalized semismooth Newton method tailored to the active manifold M. This hybrid approach leverages the strengths of both methods to enhance convergence. Finally, through extensive numerical simulations, we showcase the superiority of our heuristic algorithm over existing state-of-the-art methods for sparse recovery.
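To make merit (ii) concrete, the short Python sketch below (an illustration added here, not code from the paper; the function name l1_over_l2 is our own) computes the L1/L2 ratio and checks that it is invariant to rescaling, while a fully dense vector attains the largest possible value sqrt(n).

    import numpy as np

    def l1_over_l2(x, eps=1e-12):
        # L1/L2 norm ratio: a parameter-free, scale-invariant sparseness surrogate.
        x = np.asarray(x, dtype=float)
        return np.linalg.norm(x, 1) / (np.linalg.norm(x, 2) + eps)

    # Scale invariance: rescaling x leaves the ratio unchanged.
    x = np.array([3.0, 0.0, -1.0, 0.0, 2.0])
    assert np.isclose(l1_over_l2(x), l1_over_l2(10.0 * x))

    # A sparser vector gives a smaller ratio; a fully dense one attains sqrt(n).
    dense = np.ones(5)
    print(l1_over_l2(x))      # ~1.60
    print(l1_over_l2(dense))  # ~2.24 = sqrt(5)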
Pages: 2874-2889
Number of Pages: 16