AN ADAPTIVE l1-l2-TYPE MODEL WITH HIERARCHIES FOR SPARSE SIGNAL RECONSTRUCTION PROBLEM

Cited by: 0
Authors
Ding, Yanyun [1 ]
Yue, Zhixiao [2 ]
Zhang, Haibin [1 ]
Affiliations
[1] Beijing Univ Technol, Dept Operat Res & Informat Engn, Beijing 100124, Peoples R China
[2] Southern Univ Sci & Technol, Dept Math, Shenzhen 518055, Peoples R China
Source
PACIFIC JOURNAL OF OPTIMIZATION | 2022, Vol. 18, No. 4
Keywords
convex optimization; sparse signal reconstruction; hierarchical optimization; l(p)-l(1)-l(2) minimization; alternating direction method of multipliers; MINIMIZATION; REGRESSION; MULTIPLIERS; EFFICIENT; ALGORITHM; SELECTION
DOI
Not available
Chinese Library Classification (CLC)
C93 [Management]; O22 [Operations Research]
Discipline Classification Codes
070105; 12; 1201; 1202; 120202
Abstract
This paper addresses an adaptive l(1)-l(2) regularized model for sparse signal reconstruction in the framework of hierarchical convex optimization. The problem is formulated as a bi-level convex optimization model, and the challenging bi-level model can be turned into a single-level constrained optimization problem by exploiting a priori information. The l(1)-l(2)-norm regularized least-squares sparse optimization problem is also known as the elastic net, and numerous simulations and real-world data sets show that the elastic net often outperforms the Lasso. However, the elastic net is mainly suited to handling Gaussian noise. In this paper, we propose an adaptive and robust model for reconstructing sparse signals, denoted l(p)-l(1)-l(2), in which the l(p)-norm with p >= 1 measures the data fidelity and the l(1)-l(2) term measures the sparsity. The model is robust and flexible in the sense that it can deal with different types of noise. To solve it, we employ an alternating direction method of multipliers (ADMM) based on introducing one or a pair of auxiliary variables. Numerical experiments demonstrate that both the proposed model and the algorithms outperform the Lasso model solved by ADMM on the sparse signal reconstruction problem.
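A minimal sketch of the model the abstract describes, under the interpretation that the l(1)-l(2) term is the elastic-net regularizer; the weights lambda_1, lambda_2 and the 1/p scaling are illustrative and are not taken from the paper:

\min_{x \in \mathbb{R}^n} \; \frac{1}{p}\,\|Ax - y\|_p^p \;+\; \lambda_1 \|x\|_1 \;+\; \frac{\lambda_2}{2}\,\|x\|_2^2, \qquad p \ge 1.

Below is a hedged Python sketch of an ADMM solver for the Gaussian-noise special case p = 2, using the splitting x = z so that the l1 term is handled by soft-thresholding. Function names, parameter values, and the fixed iteration count are illustrative and do not reproduce the paper's implementation.

import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1: elementwise soft-thresholding.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def elastic_net_admm(A, y, lam1=0.1, lam2=0.01, rho=1.0, n_iter=300):
    # ADMM for min_x 0.5*||A x - y||_2^2 + lam1*||x||_1 + 0.5*lam2*||x||_2^2
    # with the splitting x = z (scaled dual variable u).
    m, n = A.shape
    z = np.zeros(n)
    u = np.zeros(n)
    AtA = A.T @ A
    Aty = A.T @ y
    # The x-update solves (A^T A + (lam2 + rho) I) x = A^T y + rho (z - u);
    # the matrix is constant across iterations, so factor it once.
    L = np.linalg.cholesky(AtA + (lam2 + rho) * np.eye(n))
    for _ in range(n_iter):
        rhs = Aty + rho * (z - u)
        x = np.linalg.solve(L.T, np.linalg.solve(L, rhs))
        z = soft_threshold(x + u, lam1 / rho)
        u = u + x - z
    return z

# Toy usage: recover a 10-sparse signal from 80 noisy Gaussian measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((80, 200))
x_true = np.zeros(200)
x_true[rng.choice(200, size=10, replace=False)] = rng.standard_normal(10)
y = A @ x_true + 0.01 * rng.standard_normal(80)
x_hat = elastic_net_admm(A, y)
print("recovered support:", np.flatnonzero(np.abs(x_hat) > 1e-3))

For p other than 2 (e.g. l1 fidelity for impulsive noise), the same splitting idea applies with an additional auxiliary variable for the residual Ax - y, as the abstract indicates.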
Pages: 695-712
Number of pages: 18
Related Papers
50 records in total
  • [31] Intelligent greedy pursuit model for sparse reconstruction based on l0 minimization
    Li, Dan
    Wang, Qiang
    Shen, Yi
    SIGNAL PROCESSING, 2016, 122 : 138 - 151
  • [32] Fixed-Time Stable Neurodynamic Flow to Sparse Signal Recovery via Nonconvex L1-β2-Norm
    Zhao, You
    Liao, Xiaofeng
    He, Xing
    NEURAL COMPUTATION, 2022, 34 (08) : 1727 - 1755
  • [33] Smoothed l1-regularization-based line search for sparse signal recovery
    Yang, Zhimin
    Chai, Yi
    Chen, Tao
    Qu, Jianfeng
    SOFT COMPUTING, 2017, 21 (16) : 4813 - 4828
  • [34] Block sparse recovery via mixed l2/l1 minimization
    Jun Hong Lin
    Song Li
    Acta Mathematica Sinica, English Series, 2013, 29 : 1401 - 1412
  • [35] Sparse minimal learning machines via l1/2 norm regularization
    Dias, Madson L. D.
    Freire, Ananda L.
    Souza Junior, Amauri H.
    da Rocha Neto, Ajalmar R.
    Gomes, Joao P. P.
    2018 7TH BRAZILIAN CONFERENCE ON INTELLIGENT SYSTEMS (BRACIS), 2018, : 206 - 211
  • [36] Performance guarantees of regularized l1-2-minimization for robust sparse recovery
    Wang, Wendong
    Zhang, Jing
    SIGNAL PROCESSING, 2022, 201
  • [37] Sparse Functional Link Adaptive Filter Using an l1-Norm Regularization
    Comminiello, Danilo
    Scarpiniti, Michele
    Scardapane, Simone
    Uncini, Aurelio
    2018 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS (ISCAS), 2018
  • [38] A Novel Projection Neural Network for Sparse Optimization With L1-Minimization Problem
    Wen, Hongsong
    He, Xing
    Huang, Tingwen
    IEEE TRANSACTIONS ON EMERGING TOPICS IN COMPUTATIONAL INTELLIGENCE, 2024, 8 (05): : 3339 - 3351
  • [39] Subband Adaptive Filtering with l1-Norm Constraint for Sparse System Identification
    Choi, Young-Seok
    MATHEMATICAL PROBLEMS IN ENGINEERING, 2013, 2013
  • [40] Source reconstruction for bioluminescence tomography via L1/2 regularization
    Yu, Jingjing
    Li, Qiyue
    Wang, Haiyu
    JOURNAL OF INNOVATIVE OPTICAL HEALTH SCIENCES, 2018, 11 (02)