Convergence of Distributed Gradient-Tracking-Based Optimization Algorithms with Random Graphs

Cited by: 0
Authors
WANG Jiexiang [1 ]
FU Keli [2 ]
GU Yu [2 ]
LI Tao [2 ]
Affiliations
[1] School of Mechatronic Engineering and Automation, Shanghai University
[2] Key Laboratory of Pure Mathematics and Mathematical Practice, School of Mathematical Sciences, East China Normal University
Keywords
Distributed optimization; geometric convergence; gradient tracking; random graph;
DOI
Not available
Chinese Library Classification (CLC)
O224 [Mathematical theory of optimization];
Discipline codes
070105; 1201
Abstract
This paper studies distributed convex optimization over a multi-agent system, in which each agent owns only a local cost function that is convex with Lipschitz continuous gradients. The agents' goal is to cooperatively minimize the sum of the local cost functions. The underlying communication networks are modelled by a sequence of random balanced digraphs, which are required to be neither spatially nor temporally independent, nor to follow any particular distribution. The authors use a distributed gradient-tracking-based optimization algorithm to solve the problem. In the algorithm, each agent maintains an estimate of the optimal solution and an estimate of the average of all the local gradients, and updates both by combining a consensus step with a gradient-tracking step. The authors prove that the algorithm converges to the optimal solution at a geometric rate if the conditional graphs are uniformly strongly connected, the global cost function is strongly convex, and the step-sizes do not exceed certain upper bounds.
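The update described in the abstract (a consensus step combined with a gradient-tracking step) can be sketched in code. The following is a minimal illustrative implementation on a toy quadratic problem, not the paper's construction: the fixed doubly stochastic matrix `W`, the cost coefficients, and the step-size `alpha` are all assumptions standing in for the paper's random balanced digraphs and general convex costs.

```python
import numpy as np

# Illustrative sketch of a gradient-tracking update (DIGing-style) on a toy
# problem; the fixed doubly stochastic matrix W is a stand-in for the paper's
# random balanced digraphs, and all constants here are assumptions.
# Each of n agents holds a local quadratic cost
#   f_i(x) = 0.5 * a[i] * x**2 - b[i] * x,
# so the minimizer of the sum is x* = sum(b) / sum(a).

n = 3
a = np.array([1.0, 2.0, 3.0])   # local curvatures (global cost strongly convex)
b = np.array([1.0, 2.0, 3.0])   # local linear terms
x_star = b.sum() / a.sum()      # = 1.0

def grad(i, x):
    """Gradient of agent i's local cost at x."""
    return a[i] * x - b[i]

# Doubly stochastic mixing matrix on a fully connected 3-agent network.
W = np.array([[0.50, 0.25, 0.25],
              [0.25, 0.50, 0.25],
              [0.25, 0.25, 0.50]])

alpha = 0.02                    # small step-size, below the stability bound
x = np.zeros(n)                                   # estimates of the optimum
y = np.array([grad(i, x[i]) for i in range(n)])   # gradient trackers

for _ in range(2000):
    x_new = W @ x - alpha * y   # consensus step plus descent along the tracker
    # Tracker update: mix neighbors' trackers, then add the local gradient
    # innovation; the mean of y always equals the current average gradient.
    y = W @ y + np.array([grad(i, x_new[i]) - grad(i, x[i]) for i in range(n)])
    x = x_new

print(x)  # all three estimates close to x_star = 1.0
```

The key design point mirrored from the abstract: initializing each tracker `y[i]` at the local gradient and updating it with the gradient innovation keeps the network-wide average of `y` equal to the average of the local gradients at every iteration, so the consensus step steers all agents toward the minimizer of the sum rather than of their own costs.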
Pages: 1438-1453 (16 pages)
Related Papers
50 in total (items [41]-[50] shown)
  • [41] On the O(1/k) Convergence of Distributed Gradient Methods Under Random Quantization
    Dutta, Amit
    Doan, Thinh T.
    IEEE CONTROL SYSTEMS LETTERS, 2024, 8: 2967-2972
  • [42] Nabla fractional distributed optimization algorithms over undirected/directed graphs
    Hong, Xiaolin
    Wei, Yiheng
    Zhou, Shuaiyu
    Yue, Dongdong
    JOURNAL OF THE FRANKLIN INSTITUTE-ENGINEERING AND APPLIED MATHEMATICS, 2024, 361 (03): 1436-1454
  • [43] Reactive Power Optimization for Distribution Network Based on Distributed Random Gradient-Free Algorithm
    Xie, Jun
    Liang, Chunxiang
    Xiao, Yichen
    ENERGIES, 2018, 11 (03)
  • [44] Balance of Communication and Convergence: Predefined-Time Distributed Optimization Based on Zero-Gradient-Sum
    Zhang, Renyongkang
    Guo, Ge
    Zhou, Zeng-Di
    IEEE TRANSACTIONS ON CYBERNETICS, 2025, 55 (02): 661-671
  • [45] ACHIEVING GEOMETRIC CONVERGENCE FOR DISTRIBUTED OPTIMIZATION OVER TIME-VARYING GRAPHS
    Nedic, Angelia
    Olshevsky, Alex
    Shi, Wei
    SIAM JOURNAL ON OPTIMIZATION, 2017, 27 (04): 2597-2633
  • [46] Distributed Big-Data Optimization via Blockwise Gradient Tracking
    Notarnicola, Ivano
    Sun, Ying
    Scutari, Gesualdo
    Notarstefano, Giuseppe
    IEEE TRANSACTIONS ON AUTOMATIC CONTROL, 2021, 66 (05): 2045-2060
  • [47] Accelerated Gradient Tracking over Time-varying Graphs for Decentralized Optimization
    Li, Huan
    Lin, Zhouchen
    JOURNAL OF MACHINE LEARNING RESEARCH, 2024, 25
  • [48] Linear convergence of primal-dual gradient methods and their performance in distributed optimization
    Alghunaim, Sulaiman A.
    Sayed, Ali H.
    AUTOMATICA, 2020, 117
  • [49] An Improved Distributed Nesterov Gradient Tracking Algorithm for Smooth Convex Optimization Over Directed Networks
    Lin, Yifu
    Li, Wenling
    Zhang, Bin
    Du, Junping
    IEEE TRANSACTIONS ON AUTOMATIC CONTROL, 2025, 70 (04): 2738-2745
  • [50] Convergence Rates of Distributed Gradient Methods Under Random Quantization: A Stochastic Approximation Approach
    Doan, Thinh T.
    Maguluri, Siva Theja
    Romberg, Justin
    IEEE TRANSACTIONS ON AUTOMATIC CONTROL, 2021, 66 (10): 4469-4484