Federated Learning Over Wireless Networks: Challenges and Solutions

Cited by: 21
Authors
Beitollahi, Mahdi [1 ]
Lu, Ning [1 ]
Affiliations
[1] Queens Univ, Dept Elect & Comp Engn, Kingston, ON K7L 3N6, Canada
Keywords
Communication resources; federated learning (FL); power limitation; wireless networks; STOCHASTIC GRADIENT DESCENT; PRIVACY; OPTIMIZATION; CONVERGENCE; FRAMEWORK; SECURITY;
DOI
10.1109/JIOT.2023.3285868
CLC number
TP [automation technology; computer technology];
Subject classification code
0812 ;
Abstract
Motivated by ever-increasing computational resources at edge devices and growing privacy concerns, a new machine learning (ML) framework called federated learning (FL) has been proposed. FL enables user devices, such as mobile and Internet of Things (IoT) devices, to collaboratively train an ML model by sending only the model parameters instead of raw data. FL is considered the key enabling approach for privacy-preserving, distributed ML systems. However, FL requires frequent exchange of learned model updates between multiple user devices and the cloud/edge server, which introduces significant communication overhead and hence poses a major challenge for FL over wireless networks, where communication resources are limited. Moreover, FL consumes a considerable amount of energy in transmitting learned model updates, which poses another challenge for FL over wireless networks, which often include unplugged devices with limited battery resources. In addition, practical implementations of FL over wireless networks raise further privacy issues. In this survey, we discuss each of these challenges and their respective state-of-the-art solutions in depth. By illustrating the tradeoffs among these solutions, we discuss the underlying effect of the wireless network on the performance of FL. Finally, by highlighting the gaps between research and practical implementations, we identify future research directions for engineering FL over wireless networks.
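The parameter-exchange loop the abstract describes (clients train locally, only model parameters cross the network, the server averages them) can be sketched as a minimal federated averaging (FedAvg-style) toy example. All names and the scalar linear model below are hypothetical illustrations, not the survey's notation:

```python
import random

def local_update(w, data, lr=0.1):
    """One local SGD pass on a toy scalar model y = w * x (hypothetical)."""
    for x, y in data:
        grad = 2 * (w * x - y) * x  # derivative of squared error w.r.t. w
        w -= lr * grad
    return w

def fed_avg(client_weights):
    """Server aggregation: plain average of the received parameters."""
    return sum(client_weights) / len(client_weights)

# Toy run: three clients each hold data generated by y = 3x plus small noise.
random.seed(0)
clients = [[(x, 3 * x + random.gauss(0, 0.01)) for x in (0.5, 1.0, 1.5)]
           for _ in range(3)]

w_global = 0.0
for _ in range(30):                            # communication rounds
    updates = [local_update(w_global, d) for d in clients]
    w_global = fed_avg(updates)                # only parameters, never raw data
```

Each round costs one uplink transmission of the model per client, which is the communication and energy overhead the survey analyzes; compression and scheduling schemes act on exactly this exchanged payload.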
Pages: 14749-14763
Page count: 15