Neurodynamic Algorithms With Finite/Fixed-Time Convergence for Sparse Optimization via l1 Regularization

Cited by: 6
Authors
Wen, Hongsong [1 ]
He, Xing [1 ]
Huang, Tingwen [2 ]
Yu, Junzhi [3 ,4 ]
Affiliations
[1] Southwest Univ, Coll Elect & Informat Engn, Chongqing Key Lab Nonlinear Circuits & Intelligent, Chongqing 400715, Peoples R China
[2] Texas A&M Univ Qatar, Dept Math, Doha, Qatar
[3] Peking Univ, Coll Engn, Dept Adv Mfg & Robot, State Key Lab Turbulence & Complex Syst, Beijing 100871, Peoples R China
[4] Peking Univ, Nanchang Innovat Inst, Nanchang 330224, Peoples R China
Source
IEEE TRANSACTIONS ON SYSTEMS, MAN, AND CYBERNETICS: SYSTEMS | 2024 / Vol. 54 / No. 1
Keywords
l1 regularization; finite-time stability (FTS); fixed-time stability (FxTS); locally competitive algorithm (LCA); sparse optimization; SIGNAL RECONSTRUCTION; RECOVERY; PROJECTION;
DOI
10.1109/TSMC.2023.3304850
Chinese Library Classification (CLC)
TP [automation technology, computer technology];
Discipline code
0812;
Abstract
Sparse optimization problems arise in a wide range of research areas, and previous work has yielded useful insights and elegant methods for proving the stability and convergence of neurodynamic algorithms. This article develops several neurodynamic algorithms for sparse signal recovery by solving the l1 regularization problem. First, within the framework of the locally competitive algorithm (LCA), a modified LCA (MLCA) with finite-time convergence and an MLCA with fixed-time convergence are designed. Then, the sliding-mode control (SMC) technique is introduced and modified into a modified SMC (MSMC), which is combined with the LCA to design an MSMC-LCA with finite-time convergence and an MSMC-LCA with fixed-time convergence. The solutions of the proposed neurodynamic algorithms are shown to exist and to be unique when the observation matrix satisfies the restricted isometry property (RIP), while finite-time or fixed-time convergence to the optimal points is established via Lyapunov-based analysis. In addition, using the notions of finite-time stability (FTS) and fixed-time stability (FxTS), upper bounds on the convergence time of the proposed neurodynamic algorithms are given, and the convergence-time bounds obtained for the MLCA and MSMC-LCA with fixed-time convergence are shown to be independent of the initial conditions. Finally, simulation experiments on signal recovery and image recovery demonstrate the superior performance of the proposed neurodynamic algorithms.
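The MLCA and MSMC-LCA described in the abstract build on the classical locally competitive algorithm. As a point of reference only, the sketch below shows a plain baseline LCA (soft-thresholding dynamics integrated by forward Euler) applied to the l1-regularized problem min_x 0.5*||y - Phi x||_2^2 + lambda*||x||_1; all function names, parameter names, and values are illustrative assumptions, and this baseline does not include the finite/fixed-time modifications proposed in the paper.

```python
# Baseline LCA sketch for l1-regularized sparse recovery (illustrative only;
# not the paper's MLCA or MSMC-LCA, which add finite/fixed-time mechanisms).
import numpy as np

def soft_threshold(u, lam):
    """Soft-thresholding activation a = T_lam(u)."""
    return np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)

def lca_recover(Phi, y, lam=0.05, tau=1.0, dt=0.01, steps=5000):
    """Forward-Euler integration of the baseline LCA state equation
    tau * du/dt = Phi^T y - u - (Phi^T Phi - I) a, with a = T_lam(u)."""
    n = Phi.shape[1]
    u = np.zeros(n)                      # internal state of the network
    G = Phi.T @ Phi - np.eye(n)          # lateral-inhibition (coupling) matrix
    b = Phi.T @ y                        # driving input from the measurements
    for _ in range(steps):
        a = soft_threshold(u, lam)       # sparse output variable
        u += (dt / tau) * (b - u - G @ a)
    return soft_threshold(u, lam)

# Toy usage: recover a k-sparse vector from m < n random measurements.
rng = np.random.default_rng(0)
n, m, k = 100, 40, 5
x_true = np.zeros(n)
x_true[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
Phi = rng.standard_normal((m, n)) / np.sqrt(m)   # random matrix, RIP-friendly with high probability
y = Phi @ x_true
x_hat = lca_recover(Phi, y, lam=0.01)
print("relative recovery error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

In this baseline, the coupling term (Phi^T Phi - I) a is what drives the output toward the l1-regularized optimum, but convergence is only asymptotic; the paper's MLCA and MSMC-LCA presumably modify this right-hand side so that Lyapunov-based analysis yields finite-time or fixed-time convergence bounds, with the fixed-time bounds independent of the initial conditions.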
Pages: 131-142
Page count: 12