Communication-Efficient Federated Learning For Massive MIMO Systems

Cited by: 3
Authors
Mu, Yuchen [1 ]
Garg, Navneet [1 ]
Ratnarajah, Tharmalingam [1 ]
Affiliations
[1] Univ Edinburgh, Sch Engn, Edinburgh, Midlothian, Scotland
Funding
UK Engineering and Physical Sciences Research Council (EPSRC);
Keywords
Deep learning; federated learning; massive MIMO; CONVERGENCE;
DOI
10.1109/WCNC51071.2022.9771851
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Federated learning (FL) is an emerging distributed learning paradigm in which data acquisition and computation are decoupled to preserve users' data privacy. During training, model weights must be updated at both the base station (BS) and the local users. When exchanged between users and the BS, these weights are subject to imperfections in the uplink (UL) and downlink (DL) transmissions due to the limited reliability of wireless channels. In this paper, for an FL algorithm in a single-cell massive MIMO cellular communication system, we investigate the impact of both DL and UL transmissions and improve communication efficiency by adjusting the number of global communication rounds, the transmit power, and the average codeword length after quantization. Simulation results on the standard MNIST dataset with both i.i.d. and non-i.i.d. training data distributions are presented. They show accelerated learning across various numbers of local steps and transmit powers, and reduced network energy consumption while achieving similar test accuracy at higher iterations.
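The training loop the abstract describes — local updates at each user, quantization of weights before uplink transmission, and averaging at the BS — can be sketched in a minimal FedAvg-style simulation. This is illustrative only and is not the paper's scheme: the massive MIMO channel model, power control, and codeword-length optimization are omitted, and all names (`quantize`, `federated_round`, `local_steps`, `bits`) are placeholders chosen for this sketch.

```python
import numpy as np

def quantize(w, bits=8):
    """Uniform quantization of a vector to 2**bits levels (stand-in for
    the lossy compression applied before uplink transmission)."""
    lo, hi = w.min(), w.max()
    if hi == lo:
        return w.copy()
    levels = 2 ** bits - 1
    q = np.round((w - lo) / (hi - lo) * levels)
    return lo + q * (hi - lo) / levels

def federated_round(global_w, client_data, lr=0.1, local_steps=5, bits=8):
    """One global communication round: each client runs local SGD on a
    least-squares objective, quantizes its weight update for the uplink,
    and the server (BS) averages the received updates."""
    updates = []
    for X, y in client_data:
        w = global_w.copy()
        for _ in range(local_steps):
            grad = X.T @ (X @ w - y) / len(y)   # gradient of 0.5*||Xw - y||^2 / n
            w -= lr * grad
        updates.append(quantize(w - global_w, bits))
    return global_w + np.mean(updates, axis=0)

# Synthetic i.i.d. client datasets drawn around one ground-truth model.
rng = np.random.default_rng(0)
true_w = np.array([1.0, -2.0, 0.5])
clients = []
for _ in range(4):
    X = rng.normal(size=(50, 3))
    y = X @ true_w + 0.01 * rng.normal(size=50)
    clients.append((X, y))

w = np.zeros(3)
for _ in range(30):   # global communication rounds
    w = federated_round(w, clients)
```

Varying `local_steps`, `bits`, and the number of global rounds in this sketch mirrors the trade-offs the paper tunes: fewer bits per codeword cut uplink cost but inject quantization error, while more local steps reduce the number of communication rounds needed.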
Pages: 578-583
Page count: 6
Related Papers
50 records in total
  • [31] Communication-Efficient Federated Learning with Adaptive Consensus ADMM
    He, Siyi
    Zheng, Jiali
    Feng, Minyu
    Chen, Yixin
    APPLIED SCIENCES-BASEL, 2023, 13 (09):
  • [32] Communication-Efficient Federated Learning With Gradual Layer Freezing
    Malan, Erich
    Peluso, Valentino
    Calimera, Andrea
    Macii, Enrico
    IEEE EMBEDDED SYSTEMS LETTERS, 2023, 15 (01) : 25 - 28
  • [33] Communication-Efficient Federated Learning Based on Compressed Sensing
    Li, Chengxi
    Li, Gang
    Varshney, Pramod K.
    IEEE INTERNET OF THINGS JOURNAL, 2021, 8 (20) : 15531 - 15541
  • [34] On the Convergence of Communication-Efficient Local SGD for Federated Learning
    Gao, Hongchang
    Xu, An
    Huang, Heng
    THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 7510 - 7518
  • [35] A Cooperative Analysis to Incentivize Communication-Efficient Federated Learning
    Li, Youqi
    Li, Fan
    Yang, Song
    Zhang, Chuan
    Zhu, Liehuang
    Wang, Yu
    IEEE TRANSACTIONS ON MOBILE COMPUTING, 2024, 23 (10) : 10175 - 10190
  • [36] FedDQ: Communication-Efficient Federated Learning with Descending Quantization
    Qu, Linping
    Song, Shenghui
    Tsui, Chi-Ying
    2022 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM 2022), 2022, : 281 - 286
  • [37] Communication-efficient hierarchical federated learning for IoT heterogeneous systems with imbalanced data
    Abdellatif, Alaa Awad
    Mhaisen, Naram
    Mohamed, Amr
    Erbad, Aiman
    Guizani, Mohsen
    Dawy, Zaher
    Nasreddine, Wassim
    FUTURE GENERATION COMPUTER SYSTEMS-THE INTERNATIONAL JOURNAL OF ESCIENCE, 2022, 128 : 406 - 419
  • [38] Communication-Efficient Generalized Neuron Matching for Federated Learning
    Hu, Sixu
    Li, Qinbin
    He, Bingsheng
    PROCEEDINGS OF THE 52ND INTERNATIONAL CONFERENCE ON PARALLEL PROCESSING, ICPP 2023, 2023, : 254 - 263
  • [39] Communication-Efficient Federated Learning for Digital Twin Systems of Industrial Internet of Things
    Zhao, Yunming
    Li, Li
    Liu, Ying
    Fan, Yuxi
    Lin, Kuo-Yi
    IFAC PAPERSONLINE, 2022, 55 (02): : 433 - 438
  • [40] Communication-efficient federated learning with stagewise training strategy
    Cheng, Yifei
    Shen, Shuheng
    Liang, Xianfeng
    Liu, Jingchang
    Chen, Joya
    Zhang, Tie
    Chen, Enhong
    NEURAL NETWORKS, 2023, 167 : 460 - 472