Decentralized Over-the-Air Federated Learning by Second-Order Optimization Method

Cited by: 4
Authors
Yang, Peng [1 ,2 ]
Jiang, Yuning [3 ]
Wen, Dingzhu [4 ]
Wang, Ting [1 ,2 ]
Jones, Colin N. [3 ]
Shi, Yuanming [4 ]
Affiliations
[1] East China Normal Univ, Shanghai Key Lab Trustworthy Comp, Shanghai 200062, Peoples R China
[2] East China Normal Univ, MoE Engn Res Ctr Software Hardware Codesign Techn, Shanghai 200062, Peoples R China
[3] Ecole Polytech Fed Lausanne, Automat Control Lab, CH-1015 Lausanne, Switzerland
[4] ShanghaiTech Univ, Sch Informat Sci & Technol, Shanghai 201210, Peoples R China
Keywords
Decentralized federated learning; over-the-air computation; second-order optimization method; communication; computation; challenges; algorithms; networks; privacy
DOI
10.1109/TWC.2023.3327610
CLC Classification Number
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Subject Classification Codes
0808 ; 0809 ;
Abstract
Federated learning (FL) is an emerging technique that enables privacy-preserving distributed learning. Most related work focuses on centralized FL, which relies on a parameter server to coordinate local model aggregation. However, this scheme depends heavily on the parameter server, which can cause scalability, communication, and reliability issues. To tackle these problems, decentralized FL, where information is shared through gossip, has started to attract attention. Nevertheless, current research mainly relies on first-order optimization methods, whose relatively slow convergence leads to excessive communication rounds in wireless networks. To design communication-efficient decentralized FL, we propose a novel over-the-air decentralized second-order federated algorithm. Benefiting from the fast convergence rate of the second-order method, the total number of communication rounds is significantly reduced. Meanwhile, owing to the low-latency model aggregation enabled by over-the-air computation, the communication overhead in each round can also be greatly decreased. We then analyze the convergence behavior of our approach. The result reveals an error term in each iteration that involves a cumulative noise effect. To mitigate the impact of this error term, we optimize the system with respect to both the accumulative term and the individual term. Numerical experiments demonstrate the superiority of our proposed approach and the effectiveness of the system optimization.
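The two ingredients named in the abstract — local second-order (Newton-type) updates and over-the-air model aggregation, where simultaneously transmitted signals superpose into a single noisy sum — can be illustrated with a toy simulation. This is a hedged sketch under simplifying assumptions (quadratic local losses, a fully connected gossip topology, additive Gaussian channel noise), not the paper's actual algorithm; all names (`A`, `b`, `noise_std`) are hypothetical.

```python
import numpy as np

# Toy setup: each node i holds a quadratic local loss
#   f_i(x) = 0.5 * (x - b_i)^T A_i (x - b_i),
# so a single Newton step lands exactly on the local minimizer b_i.
rng = np.random.default_rng(0)
d, n_nodes, noise_std = 3, 4, 0.01

A = [np.diag(rng.uniform(1.0, 3.0, d)) for _ in range(n_nodes)]
b = [rng.normal(size=d) for _ in range(n_nodes)]
x = [np.zeros(d) for _ in range(n_nodes)]

for _ in range(20):
    # Local second-order step: x_i <- x_i - H_i^{-1} grad_i
    for i in range(n_nodes):
        grad = A[i] @ (x[i] - b[i])
        x[i] = x[i] - np.linalg.solve(A[i], grad)
    # Over-the-air aggregation: the receiver observes the *superposed*
    # (summed) analog signals plus channel noise in one transmission,
    # then normalizes by the number of nodes.
    superposed = sum(x) + noise_std * rng.normal(size=d)
    x = [superposed / n_nodes for _ in range(n_nodes)]

consensus = x[0]
print("distance to average of local minimizers:",
      np.linalg.norm(consensus - np.mean(b, axis=0)))
```

Note that this toy only models the communication pattern (signal superposition plus channel noise); the iterates settle near the plain average of the local minimizers, and it does not reproduce the paper's precoding, power control, or convergence guarantees.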
Pages: 5632-5647
Page count: 16
Related Papers (56 total)
  • [31] Lian, X., et al. Advances in Neural Information Processing Systems, vol. 30, 2017.
  • [32] Liu, D.; Simeone, O. "Privacy for Free: Wireless Federated Learning via Uncoded Transmission With Adaptive Power Control." IEEE Journal on Selected Areas in Communications, 39(1): 170-185, 2021.
  • [33] Liu, H.; Yuan, X.; Zhang, Y.-J. A. "Reconfigurable Intelligent Surface Enabled Federated Learning: A Unified Communication-Learning Design Approach." IEEE Transactions on Wireless Communications, 20(11): 7595-7609, 2021.
  • [34] McMahan, H. B., et al. Proceedings of Machine Learning Research, vol. 54, p. 1273, 2017.
  • [35] Mokhtari, A.; Ling, Q.; Ribeiro, A. "Network Newton Distributed Optimization Methods." IEEE Transactions on Signal Processing, 65(1): 146-161, 2017.
  • [36] Nazer, B.; Gastpar, M. "Computation Over Multiple-Access Channels." IEEE Transactions on Information Theory, 53(10): 3498-3516, 2007.
  • [37] Ozfatura, E.; Rini, S.; Gunduz, D. "Decentralized SGD with Over-the-Air Computation." 2020 IEEE Global Communications Conference (GLOBECOM), 2020.
  • [38] Savazzi, S.; Nicoli, M.; Bennis, M.; Kianoush, S.; Barbieri, L. "Opportunities of Federated Learning in Connected, Cooperative, and Automated Industrial Systems." IEEE Communications Magazine, 59(2): 16-21, 2021.
  • [39] Sery, T.; Cohen, K. "On Analog Gradient Descent Learning Over Multiple Access Fading Channels." IEEE Transactions on Signal Processing, 68: 2897-2911, 2020.
  • [40] Shi, Y.; Zhou, Y.; Shi, Y. "Over-the-Air Decentralized Federated Learning." 2021 IEEE International Symposium on Information Theory (ISIT), pp. 455-460, 2021.