DISTRIBUTED RELATIVELY SMOOTH OPTIMIZATION

Cited by: 1
Authors
Jegnell, Sofia [1 ]
Vlaski, Stefan [1 ]
Affiliations
[1] Imperial Coll, Dept Elect & Elect Engn, London, England
Source
2022 IEEE 61ST CONFERENCE ON DECISION AND CONTROL (CDC) | 2022
Keywords
Distributed learning; relative smoothness; federated learning; mirror descent; stochastic optimization; 1ST-ORDER METHODS; ALGORITHM
DOI
10.1109/CDC51059.2022.9992936
Chinese Library Classification (CLC)
TP [Automation technology; computer technology]
Discipline code
0812
Abstract
Smoothness conditions, either on the cost itself or its gradients, are ubiquitous in the development and study of gradient-based algorithms for optimization and learning. In the context of distributed optimization and multi-agent systems, smoothness conditions and gradient bounds are additionally central to controlling the effect of local heterogeneity. We deviate from this paradigm and study distributed learning problems in relatively smooth environments, where cost functions may grow faster than a quadratic, and gradients need not be bounded. We generalize gradient noise conditions to cover this setting, and present convergence guarantees in relatively smooth and relatively convex environments. Numerical results corroborate the findings.
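To illustrate the relatively smooth setting the abstract describes, the sketch below runs mirror descent on a quartic cost, whose gradient is unbounded and which is therefore not smooth in the classical Lipschitz-gradient sense, but which is relatively smooth with respect to the quartic reference function h(x) = ||x||^4/4 + ||x||^2/2 used in the relative-smoothness literature (Lu, Freund, and Nesterov). This is a minimal single-agent sketch, not the paper's distributed algorithm; the cost, reference function, and relative-smoothness constant L = 3 are illustrative assumptions.

```python
import numpy as np

def bregman_step(x, grad_f, eta):
    """One mirror-descent step with reference h(x) = ||x||^4/4 + ||x||^2/2.

    Solves grad_h(x_next) = grad_h(x) - eta * grad_f(x), where
    grad_h(x) = (||x||^2 + 1) x.  Since grad_h(y) is parallel to y,
    x_next points along the right-hand side g, and its norm r solves
    the monotone scalar cubic r^3 + r = ||g||, found here by bisection.
    """
    g = (x @ x + 1.0) * x - eta * grad_f(x)
    gnorm = np.linalg.norm(g)
    if gnorm == 0.0:
        return np.zeros_like(x)
    lo, hi = 0.0, max(1.0, gnorm)      # bracket: r^3 + r is increasing
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if mid ** 3 + mid < gnorm:
            lo = mid
        else:
            hi = mid
    r = 0.5 * (lo + hi)
    return (r / gnorm) * g

# Quartic cost f(x) = ||x||^4 / 4: grows faster than a quadratic and has
# an unbounded gradient, but is 3-smooth relative to h above.
grad_f = lambda x: (x @ x) * x

x = np.array([2.0, -1.0])
for _ in range(1000):
    x = bregman_step(x, grad_f, eta=1.0 / 3.0)   # step size 1/L with L = 3
```

Here the mirror step is the unconstrained Bregman proximal step argmin_y { eta <grad f(x), y> + D_h(y, x) }; the particular choice of h makes the update solvable in closed form up to a scalar cubic equation.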
Pages: 6511-6517
Page count: 7