Decentralized Federated Learning for Over-Parameterized Models

Cited by: 3
Authors
Qin, Tiancheng [1 ]
Etesami, S. Rasoul [1 ]
Uribe, Cesar A. [2 ]
Affiliations
[1] Univ Illinois, Dept Ind & Enterprise Syst Engn, Coordinated Sci Lab, Urbana, IL 61801 USA
[2] Rice Univ, Dept Elect & Comp Engn, Houston, TX 77005 USA
Source
2022 IEEE 61ST CONFERENCE ON DECISION AND CONTROL (CDC) | 2022
Funding
U.S. National Science Foundation;
Keywords
Decentralized Federated Learning; Decentralized Optimization; Local SGD; Overparameterization;
DOI
10.1109/CDC51059.2022.9992924
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
Modern machine learning, and deep learning in particular, features models that are highly expressive and over-parameterized: they can interpolate the training data by driving the empirical loss close to zero. We analyze the convergence rate of decentralized stochastic gradient descent (SGD), the core algorithm of decentralized federated learning (DFL), for such over-parameterized models. Our analysis covers decentralized SGD with time-varying networks, local updates, and heterogeneous data. We establish strong convergence guarantees, with and without the assumption of convex objectives, that either improve upon the existing literature or are the first for this regime.
Pages: 5200-5205
Page count: 6
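
The abstract describes decentralized SGD with local updates over time-varying networks and heterogeneous data. As a rough illustration of that setting (not the paper's exact algorithm or analysis), here is a minimal Python sketch: each node takes a few local SGD steps on its own least-squares data, then nodes gossip-average their iterates through a time-varying doubly stochastic mixing matrix. All names and parameter values (mixing_matrix, local_steps, eta, the alternating-path topology) are illustrative assumptions; the model dimension exceeds the total sample count so an interpolating solution exists, matching the over-parameterized regime.

```python
import numpy as np

rng = np.random.default_rng(0)

# Over-parameterized setting: model dimension exceeds the total number of
# data points (30 > 4 * 5), so one parameter vector can interpolate all data.
n_nodes, dim, n_samples = 4, 30, 5
A = [rng.standard_normal((n_samples, dim)) for _ in range(n_nodes)]  # node data
b = [rng.standard_normal(n_samples) for _ in range(n_nodes)]         # node targets
x = [np.zeros(dim) for _ in range(n_nodes)]                          # local iterates

def stochastic_grad(i, x_i):
    """Single-sample gradient of node i's least-squares loss (heterogeneous data)."""
    j = rng.integers(n_samples)
    a = A[i][j]
    return (a @ x_i - b[i][j]) * a

def mixing_matrix(t, n):
    """Doubly stochastic gossip matrix on a time-varying graph:
    disjoint edges of a path, alternating between rounds (an assumption)."""
    W = np.eye(n)
    for i in range(t % 2, n - 1, 2):
        W[i, i] = W[i + 1, i + 1] = 0.5
        W[i, i + 1] = W[i + 1, i] = 0.5
    return W

eta, local_steps, rounds = 0.01, 5, 400
for t in range(rounds):
    # a few local SGD steps per node between communication rounds
    for i in range(n_nodes):
        for _ in range(local_steps):
            x[i] = x[i] - eta * stochastic_grad(i, x[i])
    # one gossip-averaging step: X <- W_t X
    X = mixing_matrix(t, n_nodes) @ np.stack(x)
    x = list(X)

avg_loss = np.mean([0.5 * np.sum((A[i] @ x[i] - b[i]) ** 2) for i in range(n_nodes)])
print(f"average empirical loss after {rounds} rounds: {avg_loss:.3e}")
```

The alternating edge sets keep each round's matrix doubly stochastic while the union of graphs over consecutive rounds is connected, a standard way to model time-varying communication in decentralized optimization; in the interpolation regime the average empirical loss should decay toward zero as the abstract suggests.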