ADOM: Accelerated Decentralized Optimization Method for Time-Varying Networks

Cited by: 0
Authors
Kovalev, Dmitry [1]
Shulgin, Egor [1]
Richtarik, Peter [1]
Rogozin, Alexander [2,3]
Gasnikov, Alexander [2,3]
Affiliations
[1] King Abdullah Univ Sci & Technol, Thuwal, Saudi Arabia
[2] Moscow Inst Phys & Technol, Dolgoprudnyi, Russia
[3] Higher Sch Econ, Moscow, Russia
Funding
Russian Science Foundation;
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104; 0812; 0835; 1405;
Abstract
We propose ADOM - an accelerated method for smooth and strongly convex decentralized optimization over time-varying networks. ADOM uses a dual oracle, i.e., we assume access to the gradient of the Fenchel conjugate of the individual loss functions. Up to a constant factor, which depends only on the network structure, its communication complexity is the same as that of the accelerated Nesterov gradient method (Nesterov, 2003). To the best of our knowledge, only the algorithm of Rogozin et al. (2019) has a convergence rate with similar properties. However, their algorithm converges under the very restrictive assumption that the number of network changes cannot exceed a tiny percentage of the number of iterations. This assumption is hard to satisfy in practice, as changes in the network topology usually cannot be controlled. In contrast, ADOM merely requires the network to stay connected throughout time.
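A minimal sketch of the dual-oracle setup the abstract refers to, assuming the standard decentralized formulation over n nodes with L-smooth, mu-strongly convex local losses; the closing quadratic example is hypothetical and purely illustrative, not taken from the paper:

    \min_{x \in \mathbb{R}^d} \; \frac{1}{n} \sum_{i=1}^{n} f_i(x)

The dual oracle returns, for a query point y, the gradient of the Fenchel conjugate of a local loss:

    f_i^*(y) = \sup_{x \in \mathbb{R}^d} \big( \langle x, y \rangle - f_i(x) \big), \qquad \nabla f_i^*(y) = \operatorname*{arg\,max}_{x \in \mathbb{R}^d} \big( \langle x, y \rangle - f_i(x) \big)

For a hypothetical quadratic loss f_i(x) = \tfrac{1}{2} x^\top A_i x - b_i^\top x with A_i \succ 0, this oracle is available in closed form: \nabla f_i^*(y) = A_i^{-1}(y + b_i).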
Pages: 10
Related papers
50 records in total
  • [1] An Accelerated Method For Decentralized Distributed Stochastic Optimization Over Time-Varying Graphs
    Rogozin, Alexander
    Bochko, Mikhail
    Dvurechensky, Pavel
    Gasnikov, Alexander
    Lukoshkin, Vladislav
    2021 60TH IEEE CONFERENCE ON DECISION AND CONTROL (CDC), 2021, : 3367 - 3373
  • [2] Accelerated Gradient Tracking over Time-varying Graphs for Decentralized Optimization
    Li, Huan
    Lin, Zhouchen
    JOURNAL OF MACHINE LEARNING RESEARCH, 2024, 25
  • [3] Decentralized optimization with affine constraints over time-varying networks
    Yarmoshik, Demyan
    Rogozin, Alexander
    Gasnikov, Alexander
    COMPUTATIONAL MANAGEMENT SCIENCE, 2024, 21 (01)
  • [4] Decentralized convex optimization on time-varying networks with application to Wasserstein barycenters
    Yufereva, Olga
    Persiianov, Michael
    Dvurechensky, Pavel
    Gasnikov, Alexander
    Kovalev, Dmitry
    COMPUTATIONAL MANAGEMENT SCIENCE, 2024, 21 (01)
  • [5] A Decentralized Prediction-Correction Method for Networked Time-Varying Convex Optimization
    Simonetto, Andrea
    Mokhtari, Aryan
    Koppel, Alec
    Leus, Geert
    Ribeiro, Alejandro
    2015 IEEE 6TH INTERNATIONAL WORKSHOP ON COMPUTATIONAL ADVANCES IN MULTI-SENSOR ADAPTIVE PROCESSING (CAMSAP), 2015, : 509 - 512
  • [6] Accelerated Distributed Stochastic Nonconvex Optimization Over Time-Varying Directed Networks
    Chen, Yiyue
    Hashemi, Abolfazl
    Vikalo, Haris
    IEEE TRANSACTIONS ON AUTOMATIC CONTROL, 2025, 70 (04) : 2196 - 2211
  • [7] Byzantine-robust decentralized stochastic optimization over static and time-varying networks
    Peng, Jie
    Li, Weiyu
    Ling, Qing
    SIGNAL PROCESSING, 2021, 183
  • [8] Decentralized Conditional Gradient Method on Time-Varying Graphs
    Vedernikov, R. A.
    Rogozin, A. V.
    Gasnikov, A. V.
    PROGRAMMING AND COMPUTER SOFTWARE, 2023, 49 (06) : 505 - 512
  • [10] Lower Bounds and Optimal Algorithms for Smooth and Strongly Convex Decentralized Optimization Over Time-Varying Networks
    Kovalev, Dmitry
    Gasanov, Elnur
    Gasnikov, Alexander
    Richtarik, Peter
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34