Gradient-Tracking-Based Distributed Optimization With Guaranteed Optimality Under Noisy Information Sharing

Cited by: 10
Authors
Wang, Yongqiang [1 ]
Basar, Tamer [2 ]
Affiliations
[1] Clemson Univ, Dept Elect & Comp Engn, Clemson, SC 29634 USA
[2] Univ Illinois, Coordinated Sci Lab, Urbana, IL 61801 USA
Funding
U.S. National Science Foundation;
Keywords
Distributed optimization; gradient tracking; information-sharing noise; stochastic gradient methods; CONVERGENCE; NETWORKS; ADMM;
DOI
10.1109/TAC.2022.3212006
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
Distributed optimization enables networked agents to cooperatively solve a global optimization problem. Despite significant inroads, most existing results on distributed optimization rely on noise-free information sharing among the agents, which is problematic when communication channels are noisy, when messages are coarsely quantized, or when shared information is obscured by additive noise to achieve differential privacy. The problem of information-sharing noise is particularly pronounced in state-of-the-art gradient-tracking-based distributed optimization algorithms: the noise accumulates over iterations in the gradient-tracking estimate, and the resulting variance can even grow unbounded when the noise is persistent. This article proposes a new gradient-tracking-based distributed optimization approach that prevents information-sharing noise from accumulating in the gradient estimation. The approach remains applicable when the interagent interaction is time-varying, which is key to enabling the incorporation of a decaying factor in interagent interaction that gradually eliminates the influence of information-sharing noise. In fact, we rigorously prove that the proposed approach ensures almost sure convergence of all agents to the same optimal solution even in the presence of persistent information-sharing noise. The approach is applicable to general directed graphs. It also ensures almost sure convergence of all agents to an optimal solution when the gradients are noisy, which is common in machine learning applications. Numerical simulations confirm the effectiveness of the proposed approach.
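To make the mechanism described in the abstract concrete, below is a minimal sketch, not the authors' exact algorithm, of gradient tracking under noisy information sharing. It assumes a complete graph, quadratic per-agent objectives, and illustrative decaying schedules gamma_k (interaction weight) and lam_k (stepsize); all of these choices are assumptions for illustration only.

```python
import numpy as np

# Illustrative sketch (not the paper's exact algorithm): gradient tracking on a
# network of n agents with private quadratic objectives f_i(x) = 0.5*(x - b_i)^2,
# where every shared message is corrupted by additive Gaussian noise. A decaying
# interaction factor gamma_k and a decaying stepsize lam_k (assumed schedules,
# in the spirit of the abstract) gradually suppress the injected noise.

rng = np.random.default_rng(0)
n = 5
b = rng.normal(size=n)            # private optima; the global optimum is b.mean()
W = np.full((n, n), 1.0 / n)      # doubly stochastic mixing matrix (complete graph)
noise_std = 0.1                   # std of the information-sharing noise

grad = lambda x: x - b            # stacked per-agent gradients

x = np.zeros(n)                   # agents' decision variables
y = grad(x)                       # gradient-tracking variables

for k in range(1, 20001):
    gamma = 1.0 / k**0.6          # decaying interaction weight (assumed schedule)
    lam = 1.0 / k**0.9            # decaying stepsize (assumed schedule)
    # agents receive noisy copies of neighbors' x and y variables
    x_recv = x + noise_std * rng.normal(size=n)
    y_recv = y + noise_std * rng.normal(size=n)
    g_old = grad(x)
    x = x + gamma * (W @ x_recv - x) - lam * y
    y = y + gamma * (W @ y_recv - y) + grad(x) - g_old

print("agents:", np.round(x, 3), " optimum:", round(b.mean(), 3))
```

Under these assumed schedules, the per-round noise contribution is scaled by gamma_k and therefore vanishes over iterations, while the tracking variables y continue to approximate the network-average gradient, illustrating how a decaying interaction factor can keep persistent noise from accumulating.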
Pages: 4796-4811
Page count: 16