Communication-Efficient Federated Learning Over Capacity-Limited Wireless Networks

Times Cited: 0
Authors
Yun, Jaewon [1 ]
Oh, Yongjeong [1 ]
Jeon, Yo-Seb [1 ]
Poor, H. Vincent [2 ]
Affiliations
[1] Pohang Univ Sci & Technol, Dept Elect Engn, Pohang 37673, Gyeongbuk, South Korea
[2] Princeton Univ, Dept Elect & Comp Engn, Princeton, NJ 08544 USA
Funding
National Research Foundation of Singapore; U.S. National Science Foundation;
Keywords
Uplink; Vectors; Convergence; Encoding; Training; Quantization (signal); Solid modeling; Federated learning; distributed machine learning; sparsification; compression; communication efficiency; QUANTIZATION;
DOI
10.1109/TCCN.2024.3419039
CLC classification number
TN [Electronic Technology, Communication Technology];
Subject classification code
0809 ;
Abstract
In this paper, we propose a communication-efficient federated learning (FL) framework to enhance the convergence rate of FL under limited uplink capacity. The core idea of our framework is to transmit the values and positions of the top-S entries of a local model update, selected by magnitude. When transmitting the top-S values, we first apply a linear transformation that makes the transformed values behave approximately like Gaussian random variables. We then employ a scalar quantizer optimized for Gaussian distributions, thereby minimizing the compression error. When reconstructing the top-S values, we develop a linear minimum mean squared error method based on the Bussgang decomposition. Additionally, we introduce an error feedback strategy to compensate for both compression and reconstruction errors. We analyze the convergence rate of our framework under general assumptions, including non-convex loss functions. Based on our analytical results, we optimize the key parameters of our framework to maximize the convergence rate for a given uplink capacity. Simulation results demonstrate that our framework achieves more than a 2.2%, 1.1%, and 1.4% increase in classification accuracy on the MNIST, CIFAR-10, and CIFAR-100 datasets, respectively, compared to state-of-the-art FL frameworks under limited uplink capacity.
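The top-S sparsification and error-feedback steps described in the abstract can be sketched as follows. This is an illustrative sketch only, not the authors' implementation: the function name `top_s_with_error_feedback` is hypothetical, and the Gaussianizing transform, scalar quantization, and Bussgang-based LMMSE reconstruction stages of the paper are omitted here.

```python
import numpy as np

def top_s_with_error_feedback(update, residual, s):
    """Keep only the s largest-magnitude entries of a local model
    update, carrying the discarded mass forward via error feedback.
    (Sketch only; the paper additionally transforms and quantizes
    the kept values before uplink transmission.)"""
    corrected = update + residual             # add back last round's compression error
    idx = np.argsort(np.abs(corrected))[-s:]  # positions of the top-s magnitudes
    sparse = np.zeros_like(corrected)
    sparse[idx] = corrected[idx]              # values/positions that would be transmitted
    new_residual = corrected - sparse         # error fed back in the next round
    return sparse, new_residual

rng = np.random.default_rng(0)
g = rng.standard_normal(10)                   # a toy local model update
sparse, res = top_s_with_error_feedback(g, np.zeros_like(g), s=3)
```

By construction, the transmitted sparse vector and the residual always sum back to the error-corrected update, so no gradient mass is lost across rounds.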
Pages: 621-637
Number of pages: 17