ON THE DIVERGENCE OF DECENTRALIZED NONCONVEX OPTIMIZATION

Cited: 4
Authors:
Hong, Mingyi [1]
Zeng, Siliang [1]
Zhang, Junyu [2]
Sun, Haoran [1]
Affiliations:
[1] Univ Minnesota Twin Cities, Dept ECE, Minneapolis, MN 55455 USA
[2] Natl Univ Singapore, Dept ISEM, Singapore 119007, Singapore
Keywords:
decentralized optimization; nonconvex problems; Lipschitz continuous gradient; DISTRIBUTED OPTIMIZATION; CONVERGENCE; ALGORITHM; STRATEGIES;
DOI:
10.1137/20M1353149
CLC classification: O29 [Applied Mathematics]
Subject code: 070104
Abstract:
In this work, we study a generic class of decentralized algorithms in which N agents jointly optimize the nonconvex objective function f(u) := (1/N) \sum_{i=1}^{N} f_i(u), while only communicating with their neighbors. This class of problems has become popular in modeling many signal processing and decentralized machine learning applications, and efficient algorithms have been proposed for this type of problem. However, most of the existing decentralized algorithms require that the local function gradients \nabla f_i, as well as the average function gradient \nabla f, are Lipschitz; that is, the local Lipschitz conditions (LLC) and the global Lipschitz condition (GLC) are satisfied. In this work, we first demonstrate the importance of these Lipschitzness assumptions for state-of-the-art decentralized algorithms. By constructing a series of examples, we show that when the LLC on the local gradients \nabla f_i is not satisfied, a number of state-of-the-art decentralized algorithms diverge, even if the GLC still holds. This observation exposes a fundamental theoretical issue with the existing decentralized algorithms: their convergence conditions are strictly stronger than those of centralized algorithms such as gradient descent, which only requires the GLC. It also raises an important open question: how can one design decentralized algorithms when the LLC, or even the GLC, is not satisfied? To address this question, we design two first-order algorithms that are capable of computing stationary solutions of the original problem with neither the LLC nor the GLC. In particular, we show that the proposed algorithms converge sublinearly to a certain \epsilon-stationary solution, where the precise rate depends on various algorithmic and problem parameters.
Moreover, if the local functions f_i are lower-bounded Qth-order polynomials, then the rate becomes O(1/\epsilon^{Q-1}) for Q \geq 2 (where the O notation hides constants such as the dependency on the network topology). This rate is tight for the special case Q = 2, where each f_i satisfies the LLC. To our knowledge, this is the first work that studies decentralized nonconvex optimization with neither the LLC nor the GLC.
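The decentralized setup described in the abstract can be illustrated with a minimal sketch of decentralized gradient descent (DGD), a representative member of the algorithm class the paper studies. Everything below, including the 3-agent ring topology, the mixing weights, the quadratic local functions f_i(u) = (u - c_i)^2 / 2, and the stepsize, is an illustrative assumption and is not taken from the paper; with quadratics the LLC holds, so this toy run converges rather than exhibiting the divergence the paper constructs.

```python
# Minimal decentralized gradient descent (DGD) sketch: N agents, each holding a
# local copy x[i] of the decision variable and a private objective
# f_i(u) = (u - c_i)^2 / 2. Each step mixes neighbor copies using a doubly
# stochastic matrix W, then takes a local gradient step. All parameters here
# are illustrative assumptions, not values from the paper.

N = 3
c = [1.0, 2.0, 3.0]          # local minimizers; the global minimizer is mean(c) = 2.0
alpha = 0.1                  # constant stepsize

# Doubly stochastic mixing matrix for a 3-agent ring (self-weight 0.5).
W = [[0.50, 0.25, 0.25],
     [0.25, 0.50, 0.25],
     [0.25, 0.25, 0.50]]

def grad(i, u):
    """Gradient of the local quadratic f_i(u) = (u - c_i)^2 / 2."""
    return u - c[i]

x = [0.0] * N                # every agent starts from 0
for _ in range(2000):
    # Consensus step: average with neighbors, then local gradient step.
    mixed = [sum(W[i][j] * x[j] for j in range(N)) for i in range(N)]
    x = [mixed[i] - alpha * grad(i, x[i]) for i in range(N)]
```

With a constant stepsize, DGD is known to reach only an O(alpha)-neighborhood of the minimizer: here every agent ends up near 2.0 but not exactly at it, the usual consensus bias of DGD with fixed stepsize.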
Pages: 2879 - 2908 (30 pages)
Related Papers (50 total)
  • [31] A Penalty Alternating Direction Method of Multipliers for Convex Composite Optimization Over Decentralized Networks
    Zhang, Jiaojiao
    Liu, Huikang
    So, Anthony Man-Cho
    Ling, Qing
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2021, 69 : 4282 - 4295
  • [32] Decomposition of Nonconvex Optimization via Bi-Level Distributed ALADIN
    Engelmann, Alexander
    Jiang, Yuning
    Houska, Boris
    Faulwasser, Timm
    IEEE TRANSACTIONS ON CONTROL OF NETWORK SYSTEMS, 2020, 7 (04): : 1848 - 1858
  • [33] Distributed Global Optimization for a Class of Nonconvex Optimization With Coupled Constraints
    Ren, Xiaoxing
    Li, Dewei
    Xi, Yugeng
    Shao, Haibin
    IEEE TRANSACTIONS ON AUTOMATIC CONTROL, 2022, 67 (08) : 4322 - 4329
  • [34] Decentralized Stochastic Optimization With Pairwise Constraints and Variance Reduction
    Han, Fei
    Cao, Xuanyu
    Gong, Yi
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2024, 72 : 1960 - 1973
  • [35] Communication-Censored ADMM for Decentralized Consensus Optimization
    Liu, Yaohua
    Xu, Wei
    Wu, Gang
    Tian, Zhi
    Ling, Qing
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2019, 67 (10) : 2565 - 2579
  • [36] A Flexible Framework for Decentralized Composite Optimization with Compressed Communication
    Chang, Zhongyi
    Zhang, Zhen
    Yang, Shaofu
    Cao, Jinde
    FRACTAL AND FRACTIONAL, 2024, 8 (12)
  • [37] AN OPTIMAL ALGORITHM FOR DECENTRALIZED FINITE-SUM OPTIMIZATION
    Hendrikx, Hadrien
    Bach, Francis
    Massoulie, Laurent
    SIAM JOURNAL ON OPTIMIZATION, 2021, 31 (04) : 2753 - 2783
  • [38] Fast-Converging Decentralized ADMM for Consensus Optimization
    He, Jeannie
    Xiao, Ming
    Skoglund, Mikael
    2024 IEEE CONFERENCE ON ARTIFICIAL INTELLIGENCE, CAI 2024, 2024, : 575 - 580
  • [39] Expander Graph and Communication-Efficient Decentralized Optimization
    Chow, Yat-Tin
    Shi, Wei
    Wu, Tianyu
    Yin, Wotao
    2016 50TH ASILOMAR CONFERENCE ON SIGNALS, SYSTEMS AND COMPUTERS, 2016, : 1715 - 1720
  • [40] Accelerated Dual Averaging Methods for Decentralized Constrained Optimization
    Liu, Changxin
    Shi, Yang
    Li, Huiping
    Du, Wenli
    IEEE TRANSACTIONS ON AUTOMATIC CONTROL, 2023, 68 (04) : 2125 - 2139