A decentralized Nesterov gradient method for stochastic optimization over unbalanced directed networks

Cited by: 5
Authors
Hu, Jinhui [1 ]
Xia, Dawen [2 ]
Cheng, Huqiang [1 ]
Feng, Liping [3 ]
Ji, Lianghao [4 ]
Guo, Jing [1 ]
Li, Huaqing [1 ]
Affiliations
[1] Southwest Univ, Chongqing Key Lab Nonlinear Circuits & Intelligen, Coll Elect & Informat Engn, 2 Tiansheng Rd, Chongqing, Peoples R China
[2] Guizhou Minzu Univ, Coll Data Sci & Informat Engn, Guiyang, Peoples R China
[3] Xinzhou Teachers Univ, Dept Comp Sci, Xinzhou, Shanxi, Peoples R China
[4] Chongqing Univ Posts & Telecommun, Chongqing Key Lab Computat Intelligence, Chongqing, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
decentralized optimization; unbalanced directed networks; stochastic gradients; machine learning; multi-agent systems; TRACKING CONTROL; CONSENSUS; CONVERGENCE; ALGORITHM; SYSTEMS
DOI
10.1002/asjc.2483
CLC classification number
TP [Automation Technology, Computer Technology]
Discipline classification code
0812
Abstract
Decentralized stochastic gradient methods play a significant role in large-scale optimization, which finds many practical applications in machine learning and coordinated control. This paper studies optimization problems over unbalanced directed networks, where the common goal of the agents in the network is to optimize a global objective function expressed as a sum of local objective functions. Each agent, using only local computation and communication, is assumed to have access to a stochastic first-order oracle. To devise a noise-tolerant decentralized algorithm with accelerated linear convergence, this paper proposes a decentralized Nesterov gradient algorithm that uses stochastic gradients together with a constant step-size and momentum parameter. The proposed algorithm employs a gradient-tracking technique and is proved, via the analysis of a linear system, to converge linearly to an error ball around the optimal solution when the positive constant step-size and parameter are sufficiently small. We further recover exact linear convergence for the proposed algorithm with exact gradients under the same conditions on the constant step-size and parameter. Simulations on real-world data sets validate the theoretical findings and the practicability of the proposed algorithm.
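The update scheme described in the abstract, a Nesterov momentum step combined with gradient tracking and a stochastic first-order oracle, can be sketched as follows. This is a simplified illustration, not the paper's exact method: it assumes an undirected ring with a doubly stochastic mixing matrix `W`, scalar quadratic local objectives, and Gaussian gradient noise, whereas the paper handles unbalanced directed graphs (where such a `W` is generally unavailable). All names and parameter values here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical network: n agents on a ring with a doubly stochastic
# mixing matrix W (for illustration only; the paper's setting is an
# unbalanced directed graph, where doubly stochastic weights may not exist).
n = 5
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i + 1) % n] = 0.25
    W[i, (i - 1) % n] = 0.25

# Local objectives f_i(x) = 0.5 * (x - b_i)^2; the global optimum of
# their sum is the mean of b.
b = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
x_star = b.mean()

def stoch_grad(x, noise_std=0.01):
    """Stochastic first-order oracle: true local gradients plus noise."""
    return (x - b) + noise_std * rng.standard_normal(n)

alpha, beta = 0.1, 0.3    # constant step-size and momentum parameter
x_prev = np.zeros(n)      # x_{k-1}
x = np.zeros(n)           # x_k (one scalar decision variable per agent)
g = stoch_grad(x)
y = g.copy()              # gradient tracker, initialized at g_0

for _ in range(300):
    s = x + beta * (x - x_prev)          # Nesterov momentum step
    x_prev, x = x, W @ s - alpha * y     # consensus + tracked-gradient descent
    g_new = stoch_grad(x)
    y = W @ y + g_new - g                # gradient-tracking update
    g = g_new

err = np.max(np.abs(x - x_star))
print(err)
```

With stochastic gradients the iterates settle into an error ball around the optimum rather than converging exactly, which matches the abstract's claim; with `noise_std = 0`, the same recursion converges linearly to the exact optimum.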
Pages: 576-593 (18 pages)
Related articles
50 in total
  • [41] Distributed nonlinear estimation over unbalanced directed networks
    Li, Xiuxian (xxli@ieee.org)
    Institute of Electrical and Electronics Engineers Inc., vol. 68
  • [42] Distributed Fixed-Point Algorithms for Dynamic Convex Optimization over Decentralized and Unbalanced Wireless Networks
    Agrawal, Navneet
    Cavalcante, Renato L. G.
    Stanczak, Slawomir
    27TH INTERNATIONAL WORKSHOP ON SMART ANTENNAS, WSA 2024, 2024, : 97 - 102
  • [43] Momentum methods for stochastic optimization over time-varying directed networks
    Cui, Zhuo-Xu
    Fan, Qibin
    Jia, Cui
    SIGNAL PROCESSING, 2020, 174
  • [44] Regularized Nesterov's accelerated damped BFGS method for stochastic optimization
    Suppalap, Siwakon
    Makmuang, Dawrawee
    Damminsed, Vipavee
    Wangkeeree, Rabian
    JOURNAL OF COMPUTATIONAL AND APPLIED MATHEMATICS, 2025, 467
  • [45] Distributed stochastic optimization with gradient tracking over strongly-connected networks
    Xin, Ran
    Sahu, Anit Kumar
    Khan, Usman A.
    Kar, Soummya
    2019 IEEE 58TH CONFERENCE ON DECISION AND CONTROL (CDC), 2019, : 8353 - 8358
  • [46] Decentralized Optimization Over Time-Varying Directed Graphs With Row and Column-Stochastic Matrices
    Saadatniaki, Fakhteh
    Xin, Ran
    Khan, Usman A.
    IEEE TRANSACTIONS ON AUTOMATIC CONTROL, 2020, 65 (11) : 4769 - 4780
  • [47] A stochastic conditional gradient algorithm for decentralized online convex optimization
    Nguyen Kim Thang
    Srivastav, Abhinav
    Trystram, Denis
    Youssef, Paul
    JOURNAL OF PARALLEL AND DISTRIBUTED COMPUTING, 2022, 169 : 334 - 351
  • [48] Efficient Decentralized Stochastic Gradient Descent Method for Nonconvex Finite-Sum Optimization Problems
    Zhan, Wenkang
    Wu, Gang
    Gao, Hongchang
    THIRTY-SIXTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FOURTH CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE / TWELFTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2022, : 9006 - 9013
  • [49] Momentum-based distributed gradient tracking algorithms for distributed aggregative optimization over unbalanced directed graphs
    Wang, Zhu
    Wang, Dong
    Lian, Jie
    Ge, Hongwei
    Wang, Wei
    AUTOMATICA, 2024, 164
  • [50] Privacy Preservation of Optimization Algorithm Over Unbalanced Directed Graph
    Yan, Jiaojiao
    Cao, Jinde
    IEEE TRANSACTIONS ON NETWORK SCIENCE AND ENGINEERING, 2022, 9 (04): : 2164 - 2173