An Adaptive Compression and Communication Framework for Wireless Federated Learning

Cited by: 4
Authors
Yang, Yang [1 ]
Dang, Shuping [2 ]
Zhang, Zhenrong [1 ]
Affiliations
[1] Guangxi Univ, Guangxi Key Lab Multimedia Commun Network Technol, Sch Comp & Elect Informat, Nanning, Peoples R China
[2] Univ Bristol, Sch Elect Elect & Mech Engn, Bristol, England
Keywords
Convergence; Training; Vectors; Computational modeling; Quantization (signal); Communication system security; Optimization; Federated learning; communication-computing trade-off; distributed machine learning; joint optimization; model compression;
DOI
10.1109/TMC.2024.3382776
CLC number
TP [automation technology, computer technology];
Discipline code
0812;
Abstract
Federated learning (FL) is a distributed, privacy-preserving machine learning paradigm that enables efficient and secure model training through the collaboration of multiple clients. However, imperfect channel estimation and the resource constraints of edge devices severely hinder the convergence of typical wireless FL, while the trade-off between communication and computation still lacks in-depth exploration. These factors lead to inefficient communications and prevent the full potential of FL from being unleashed. In this regard, we formulate a joint optimization problem of communications and learning in wireless networks subject to dynamic channel variations. To address the formulated problem, we propose an integrated adaptive n-ary compression and resource management framework (ANC) that adjusts the selection of edge devices and compression schemes and allocates the optimal resource blocks and transmit power to each participating device, which effectively improves the energy efficiency and scalability of FL in resource-constrained environments. Furthermore, an upper bound on the expected global convergence rate is derived to quantify the impacts of the transmitted data volume and wireless propagation on the convergence of FL. Simulation results demonstrate that the proposed adaptive framework achieves much faster convergence while maintaining considerably low communication overhead.
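The abstract describes adaptive n-ary compression of model updates, i.e., quantizing each transmitted parameter to one of n discrete levels before uplink transmission. The paper's exact scheme is not specified in this record; the sketch below shows one common building block such frameworks rest on, an unbiased stochastic n-level quantizer (the function name `n_ary_quantize` and the uniform level placement are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

def n_ary_quantize(grad, n=3, rng=None):
    """Stochastically quantize a vector to n uniform levels in [-max|g|, max|g|].

    Each entry is rounded up or down to a neighboring level with probability
    equal to its fractional distance, so the quantizer is unbiased: E[q] = grad.
    """
    rng = np.random.default_rng() if rng is None else rng
    scale = np.max(np.abs(grad))
    if scale == 0:
        return np.zeros_like(grad), scale
    # Map entries from [-scale, scale] to the level-index range [0, n-1].
    pos = (grad / scale + 1.0) * (n - 1) / 2.0
    low = np.floor(pos)
    # Round up with probability equal to the fractional remainder (unbiased).
    idx = low + (rng.random(grad.shape) < (pos - low))
    levels = 2.0 * idx / (n - 1) - 1.0  # map indices back to [-1, 1]
    return levels * scale, scale

# Ternary (n = 3) example: entries land on {-scale, 0, +scale}.
g = np.array([0.8, -0.2, 0.05, -0.9])
q, s = n_ary_quantize(g, n=3, rng=np.random.default_rng(0))
```

With n = 3 each entry needs only ~1.58 bits plus one scalar scale per vector, which is the communication-computation trade-off knob the framework adapts: larger n tracks the gradient more faithfully, smaller n cuts uplink volume.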
Pages: 10835-10854
Page count: 20