Performance Analysis for Resource Constrained Decentralized Federated Learning Over Wireless Networks

Cited by: 3
Authors
Yan, Zhigang [1 ]
Li, Dong [1 ]
Affiliations
[1] Macau Univ Sci & Technol, Sch Comp Sci & Engn, Macau, Peoples R China
Keywords
Decentralized federated learning; resource constraint; package error; fading channel; CONVERGENCE; ALGORITHM;
DOI
10.1109/TCOMM.2024.3362143
Chinese Library Classification (CLC)
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology];
Discipline Classification Codes
0808 ; 0809 ;
Abstract
Federated learning (FL) can generate huge communication overhead for the central server, which may cause operational challenges. Furthermore, the central server's failure or compromise may result in a breakdown of the entire system. To mitigate this issue, decentralized federated learning (DFL) has been proposed as a more resilient framework that does not rely on a central server, as demonstrated in previous works. DFL involves the exchange of parameters among devices through a wireless network. To optimize the communication efficiency of the DFL system, various transmission schemes have been proposed and investigated. However, limited communication resources present a significant challenge for these schemes. Therefore, to explore the impact of constrained resources, such as computation and communication costs, on DFL, this study analyzes the model performance of resource-constrained DFL using different communication schemes (digital and analog) over wireless networks. Specifically, we provide convergence bounds for both digital and analog transmission approaches, enabling analysis of the performance of models trained in DFL. Furthermore, for digital transmission, we investigate and analyze the resource allocation between computation and communication and the convergence rates, obtaining the communication complexity and the minimum probability of correct communication required for the convergence guarantee. For analog transmission, we discuss the impact of channel fading and noise on the model performance and the maximum error accumulation that preserves the convergence guarantee over fading channels. Finally, we conduct numerical simulations to evaluate the performance and convergence rate of convolutional neural networks (CNNs) and Vision Transformer (ViT) trained in the DFL framework on the Fashion-MNIST and CIFAR-10 datasets.
Our simulation results validate our analysis and discussion, revealing how to improve performance by optimizing system parameters under different communication conditions.
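The DFL setting described in the abstract replaces server-side aggregation with peer-to-peer parameter exchange, where each device averages its model with those of its neighbors. A minimal sketch of one error-free gossip-averaging round is below; the ring topology, device count, and uniform mixing weights are illustrative assumptions, not details taken from the paper, which further models packet errors, fading, and noise.

```python
import numpy as np

def gossip_round(params, neighbors, weights):
    """One decentralized averaging round: every device mixes its own
    parameter vector with those received from its neighbors."""
    mixed = []
    for i, theta in enumerate(params):
        acc = weights[i][i] * theta
        for j in neighbors[i]:
            acc = acc + weights[i][j] * params[j]
        mixed.append(acc)
    return mixed

# Example: 4 devices on a ring, uniform (doubly stochastic) weights of 1/3
n = 4
neighbors = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}
weights = [{j: 1.0 / 3.0 for j in [i] + neighbors[i]} for i in range(n)]
params = [np.full(2, float(i)) for i in range(n)]  # device i starts at [i, i]

for _ in range(50):
    params = gossip_round(params, neighbors, weights)

# With doubly stochastic weights, all devices approach the network-wide
# average (0 + 1 + 2 + 3) / 4 = 1.5 as rounds accumulate
print([round(float(p[0]), 3) for p in params])
```

In an actual DFL system each device would also take local gradient steps between rounds, and the paper's analysis quantifies how imperfect links (packet errors in digital transmission, fading and noise in analog transmission) perturb this averaging.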
Pages: 4084 - 4100
Number of pages: 17
Related Papers
50 records in total
  • [31] Joint Optimization of Convergence and Latency for Hierarchical Federated Learning Over Wireless Networks
    Sun, Haofeng
    Tian, Hui
    Zheng, Jingheng
    Ni, Wanli
    IEEE WIRELESS COMMUNICATIONS LETTERS, 2024, 13 (03) : 691 - 695
  • [32] Scheduling Policies for Federated Learning in Wireless Networks
    Yang, Howard H.
    Liu, Zuozhu
    Quek, Tony Q. S.
    Poor, H. Vincent
    IEEE TRANSACTIONS ON COMMUNICATIONS, 2020, 68 (01) : 317 - 333
  • [33] A Survey on Federated Learning for Resource-Constrained IoT Devices
    Imteaj, Ahmed
    Thakker, Urmish
    Wang, Shiqiang
    Li, Jian
    Amini, M. Hadi
    IEEE INTERNET OF THINGS JOURNAL, 2022, 9 (01) : 1 - 24
  • [34] System Optimization of Federated Learning Networks With a Constrained Latency
    Zhao, Zichao
    Xia, Junjuan
    Fan, Lisheng
    Lei, Xianfu
    Karagiannidis, George K.
    Nallanathan, Arumugam
    IEEE TRANSACTIONS ON VEHICULAR TECHNOLOGY, 2022, 71 (01) : 1095 - 1100
  • [35] Coded Computing for Low-Latency Federated Learning Over Wireless Edge Networks
    Prakash, Saurav
    Dhakal, Sagar
    Akdeniz, Mustafa Riza
    Yona, Yair
    Talwar, Shilpa
    Avestimehr, Salman
    Himayat, Nageen
    IEEE JOURNAL ON SELECTED AREAS IN COMMUNICATIONS, 2021, 39 (01) : 233 - 250
  • [36] Coded Cooperative Networks for Semi-Decentralized Federated Learning
    Weng, Shudi
    Xiao, Ming
    Ren, Chao
    Skoglund, Mikael
    IEEE WIRELESS COMMUNICATIONS LETTERS, 2025, 14 (03) : 626 - 630
  • [37] Communication-Efficient Federated Learning Over Capacity-Limited Wireless Networks
    Yun, Jaewon
    Oh, Yongjeong
    Jeon, Yo-Seb
    Poor, H. Vincent
    IEEE TRANSACTIONS ON COGNITIVE COMMUNICATIONS AND NETWORKING, 2025, 11 (01) : 621 - 637
  • [38] EFFICIENT AND RELIABLE OVERLAY NETWORKS FOR DECENTRALIZED FEDERATED LEARNING
    Hua, Yifan
    Miller, Kevin
    Bertozzi, Andrea L.
    Qian, Chen
    Wang, Bao
    SIAM JOURNAL ON APPLIED MATHEMATICS, 2022, 82 (04) : 1558 - 1586
  • [39] On the Benefits of Multiple Gossip Steps in Communication-Constrained Decentralized Federated Learning
    Hashemi, Abolfazl
    Acharya, Anish
    Das, Rudrajit
    Vikalo, Haris
    Sanghavi, Sujay
    Dhillon, Inderjit
    IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS, 2022, 33 (11) : 2727 - 2739
  • [40] Decentralized Federated Learning With Asynchronous Parameter Sharing for Large-Scale IoT Networks
    Xie, Haihui
    Xia, Minghua
    Wu, Peiran
    Wang, Shuai
    Huang, Kaibin
    IEEE INTERNET OF THINGS JOURNAL, 2024, 11 (21) : 34123 - 34139