A Distributed Nesterov-Like Gradient Tracking Algorithm for Composite Constrained Optimization

Cited by: 3
Authors
Zheng, Lifeng [1]
Li, Huaqing [1]
Li, Jun [1]
Wang, Zheng [2]
Lu, Qingguo [3,4]
Shi, Yawei [1]
Wang, Huiwei [1]
Dong, Tao [1]
Ji, Lianghao [5,6]
Xia, Dawen [7]
Affiliations
[1] Southwest Univ, Coll Elect & Informat Engn, Chongqing Key Lab Nonlinear Circuits & Intelligen, Chongqing 400715, Peoples R China
[2] Univ New South Wales, Sch Elect Engn & Telecommun, Sydney, NSW 2052, Australia
[3] Chongqing Univ, Coll Comp Sci, Chongqing 400044, Peoples R China
[4] Minist Educ, Key Lab Ind Internet Things & Networked Control, Beijing, Peoples R China
[5] Chongqing Univ Posts & Telecommun, Chongqing Key Lab Image Cognit, Chongqing 400000, Peoples R China
[6] Chongqing Univ Posts & Telecommun, Chongqing Key Lab Computat Intelligence, Chongqing 400000, Peoples R China
[7] Guizhou Minzu Univ, Coll Data Sci & Informat Engn, Guiyang 550025, Peoples R China
Source
IEEE TRANSACTIONS ON SIGNAL AND INFORMATION PROCESSING OVER NETWORKS | 2023, Vol. 9
Funding
National Natural Science Foundation of China
Keywords
Successive convex approximation (SCA); nonconvex optimization; Nesterov method; gradient tracking; distributed optimization; AVERAGE CONSENSUS; CONVERGENCE;
DOI
10.1109/TSIPN.2023.3239698
CLC Classification
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology]
Subject Classification Codes
0808; 0809
Abstract
This paper focuses on the constrained optimization problem whose objective function is composed of a smooth (possibly nonconvex) part and a nonsmooth part. The proposed algorithm integrates the successive convex approximation (SCA) technique with a gradient tracking mechanism, aiming to achieve a linear convergence rate, and employs a momentum term to regulate the update direction at each time instant. Convergence is proved provided that the constant step size and the momentum parameter are below given upper bounds. When the smooth part is strongly convex, the algorithm converges linearly to the global optimal solution; when the smooth part is nonconvex, it converges to a local stationary solution at a sub-linear rate. Numerical simulations demonstrate the validity of the algorithm and the theoretical analysis.
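The mechanism described in the abstract, gradient tracking combined with a Nesterov-style momentum term and a proximal step for the nonsmooth part, can be sketched as below. This is a minimal illustration on a distributed lasso-type problem, not the paper's exact SCA iteration; the mixing matrix `W`, step size `alpha`, momentum `beta`, and the least-squares/l1 choice of objective are all assumptions made for the example.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (handles the nonsmooth part)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def distributed_tracking_momentum(A, b, W, lam=0.1, alpha=0.01, beta=0.2, iters=5000):
    """Gradient tracking with a Nesterov-style momentum extrapolation.

    Agent i holds local data (A[i], b[i]) for the smooth part
    f_i(x) = 0.5*||A[i] @ x - b[i]||^2, plus a common l1 term lam*||x||_1.
    W is a doubly stochastic mixing matrix for the network.
    """
    n, d = len(A), A[0].shape[1]
    grad = lambda X: np.stack([A[i].T @ (A[i] @ X[i] - b[i]) for i in range(n)])
    X = np.zeros((n, d))        # local estimates, one row per agent
    X_prev = X.copy()
    Y = grad(X)                 # trackers of the average gradient
    for _ in range(iters):
        S = X + beta * (X - X_prev)                         # momentum step
        X_prev = X
        X = soft_threshold(W @ S - alpha * Y, alpha * lam)  # mix, descend, prox
        Y = W @ Y + grad(X) - grad(X_prev)                  # tracking update
    return X
```

At a consensual fixed point every tracker equals the average gradient, so each agent satisfies the proximal-gradient optimality condition of the averaged objective, which is the intuition behind the linear rate claimed for the strongly convex case.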
Pages: 60-73 (14 pages)
Related Papers
50 records
  • [1] A Nesterov-Like Gradient Tracking Algorithm for Distributed Optimization Over Directed Networks
    Lu, Qingguo; Liao, Xiaofeng; Li, Huaqing; Huang, Tingwen
    IEEE TRANSACTIONS ON SYSTEMS MAN CYBERNETICS-SYSTEMS, 2021, 51(10): 6258-6270
  • [2] Convergence Rates of Distributed Nesterov-Like Gradient Methods on Random Networks
    Jakovetic, Dusan; Freitas Xavier, Joao Manuel; Moura, Jose M. F.
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2014, 62(04): 868-882
  • [3] An Improved Distributed Nesterov Gradient Tracking Algorithm for Smooth Convex Optimization Over Directed Networks
    Lin, Yifu; Li, Wenling; Zhang, Bin; Du, Junping
    IEEE TRANSACTIONS ON AUTOMATIC CONTROL, 2025, 70(04): 2738-2745
  • [4] Zeroth-order Gradient Tracking for Distributed Constrained Optimization
    Cheng, Songsong; Yu, Xin; Fan, Yuan; Xiao, Gaoxi
    IFAC PAPERSONLINE, 2023, 56(02): 5197-5202
  • [5] A stochastic gradient tracking algorithm with adaptive momentum for distributed optimization
    Li, Yantao; Hu, Hanqing; Zhang, Keke; Lu, Qingguo; Deng, Shaojiang; Li, Huaqing
    NEUROCOMPUTING, 2025, 637
  • [6] S-DIGing: A Stochastic Gradient Tracking Algorithm for Distributed Optimization
    Li, Huaqing; Zheng, Lifeng; Wang, Zheng; Yan, Yu; Feng, Liping; Guo, Jing
    IEEE TRANSACTIONS ON EMERGING TOPICS IN COMPUTATIONAL INTELLIGENCE, 2022, 6(01): 53-65
  • [7] A Snapshot Gradient Tracking for Distributed Optimization over Digraphs
    Che, Keqin; Yang, Shaofu
    ARTIFICIAL INTELLIGENCE, CICAI 2022, PT III, 2022, 13606: 348-360
  • [8] Distributed Nesterov Gradient and Heavy-Ball Double Accelerated Asynchronous Optimization
    Li, Huaqing; Cheng, Huqiang; Wang, Zheng; Wu, Guo-Cheng
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2021, 32(12): 5723-5737
  • [9] A distributed stochastic optimization algorithm with gradient-tracking and distributed heavy-ball acceleration
    Sun, Bihao; Hu, Jinhui; Xia, Dawen; Li, Huaqing
    FRONTIERS OF INFORMATION TECHNOLOGY & ELECTRONIC ENGINEERING, 2021, 22(11): 1463-1476
  • [10] Asynchronous distributed algorithm for constrained optimization and its application
    Wang, Ting; Li, Zhongmei; Nie, Rong; Du, Wenli
    SCIENCE CHINA-TECHNOLOGICAL SCIENCES, 2025, 68(06)