Wireless Distributed Learning: A New Hybrid Split and Federated Learning Approach

Cited by: 31
Authors
Liu, Xiaolan [1 ]
Deng, Yansha [2 ]
Mahmoodi, Toktam [2 ]
Affiliations
[1] Loughborough Univ, Inst Digital Technol, London E20 3BS, England
[2] Kings Coll London, Dept Engn, London WC2R 2LS, England
Funding
UK Engineering and Physical Sciences Research Council (EPSRC)
Keywords
Computational modeling; Wireless communication; Training; Autonomous aerial vehicles; Distance learning; Computer aided instruction; Data models; Wireless unmanned aerial vehicle (UAV) networks; Federated learning (FL); Multi-Arm Bandit (MAB); Split learning (SL); User (UE) selection; UAVs
DOI
10.1109/TWC.2022.3213411
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronic and Communication Technology]
Discipline classification codes
0808; 0809
Abstract
Cellular-connected unmanned aerial vehicles (UAVs) with flexible deployment are foreseen to be a major part of sixth generation (6G) networks. UAVs connected to the base station (BS) as aerial users (UEs) can exploit machine learning (ML) algorithms to support a wide range of advanced applications, such as object detection and video tracking. Conventionally, ML model training is performed at the BS, known as centralized learning (CL), which incurs high communication overhead due to the transmission of large datasets and raises concerns about UE privacy. To address this, distributed learning algorithms, including federated learning (FL) and split learning (SL), have been proposed to train ML models in a distributed manner by sharing only model parameters. FL requires more computational resources on the UE side than SL, while SL incurs larger communication overhead when the local dataset is large. To effectively train an ML model while accounting for the diversity of UEs with different computational capabilities and channel conditions, we first propose a novel distributed learning architecture, the hybrid split and federated learning (HSFL) algorithm, which combines the parallel model training mechanism of FL with the model splitting structure of SL. We then provide its convergence analysis under non-independent and identically distributed (non-IID) data with a random UE selection scheme. Experiments on training two ML models, Net and AlexNet, in wireless UAV networks demonstrate that the HSFL algorithm achieves higher learning accuracy than FL and lower communication overhead than SL under both IID and non-IID data, and that the learning accuracy of HSFL increases with the number of split-training UEs. We further propose a Multi-Arm Bandit (MAB) based best channel (BC) and best 2-norm (BN2) UE selection scheme (MAB-BC-BN2) that, in each round, selects the UEs with better wireless channel quality and larger local model updates for model training. Numerical results demonstrate that it achieves higher learning accuracy than the BC, MAB-BC, and MAB-BN2 UE selection schemes under non-IID, Dirichlet-non-IID, and Dirichlet-imbalanced data.
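As a reading aid only, the sketch below illustrates how the HSFL-style round described in the abstract could be organized: a subset of UEs is picked by a bandit-style index combining channel quality and local-update 2-norm, low-compute UEs train only the client-side part of a split model while the server-side part is updated at the BS, capable UEs train the full model locally, and the BS then performs a FedAvg-style merge. Everything in this sketch (the two-layer toy model, synthetic data, Rayleigh-fading stand-in, the exact UCB index, and the traffic counter) is an assumption made for illustration; it is not the authors' implementation, which trains Net and AlexNet in wireless UAV networks and derives its own MAB-BC-BN2 index and convergence analysis.

```python
# Illustrative-only sketch of HSFL-style rounds on a toy two-layer model, with a
# UCB-style stand-in for a channel-quality x update-2-norm UE selection index.
# All dimensions, data, and the selection rule are assumptions, not the paper's setup.
import numpy as np

rng = np.random.default_rng(0)
D_IN, D_HID = 8, 16                  # toy layer sizes; the cut is after layer 1
NUM_UES, ROUNDS, LOCAL_STEPS = 6, 30, 5
LR, M_SEL = 0.05, 4                  # learning rate, UEs selected per round

def make_local_data(k, n=64):
    """Synthetic, mildly non-IID local dataset for UE k (toy stand-in)."""
    X = rng.normal(loc=0.3 * k, size=(n, D_IN))
    y = (X @ np.linspace(-1.0, 1.0, D_IN))[:, None] + 0.1 * rng.normal(size=(n, 1))
    return X, y

data = [make_local_data(k) for k in range(NUM_UES)]
split_ues = {0, 1, 2}                # low-compute UEs -> split (SL-style) training
W1 = 0.1 * rng.normal(size=(D_IN, D_HID))   # client-side part (up to the cut)
W2 = 0.1 * rng.normal(size=(D_HID, 1))      # server-side part
counts = np.zeros(NUM_UES)           # selection counts for the exploration term
update_norm = np.ones(NUM_UES)       # last observed local-update 2-norms
comm_floats = 0.0                    # crude traffic counter (floats exchanged)

def client_cut_forward(w1, X):
    return np.maximum(X @ w1, 0.0)   # smashed data at the cut layer

def server_step(w2, H, y):
    """Server-side forward/backward; returns updated W2 and the cut-layer gradient."""
    err = (H @ w2 - y) / len(y)
    return w2 - LR * (H.T @ err), err @ w2.T

def client_backward(w1, X, H, gH):
    return w1 - LR * (X.T @ (gH * (H > 0)))

def select_ues(t):
    """Toy UCB index: channel gain x update 2-norm, plus an exploration bonus."""
    channel = rng.rayleigh(scale=1.0, size=NUM_UES)   # fading stand-in
    ucb = channel * update_norm + np.sqrt(2.0 * np.log(t + 1.0) / (counts + 1.0))
    return np.argsort(ucb)[-M_SEL:]

for t in range(ROUNDS):
    selected = select_ues(t)
    new_W1, new_W2 = [], []
    for k in selected:
        X, y = data[k]
        w1, w2 = W1.copy(), W2.copy()
        for _ in range(LOCAL_STEPS):
            H = client_cut_forward(w1, X)
            # For k in split_ues this step runs at the BS and only the cut-layer
            # gradient gH travels back over the air; FL UEs run it on-device.
            w2, gH = server_step(w2, H, y)
            w1 = client_backward(w1, X, H, gH)
        if k in split_ues:
            # SL branch: smashed data up + cut-layer gradients down, each local step.
            comm_floats += LOCAL_STEPS * 2 * X.shape[0] * D_HID
        else:
            # FL branch: one full local model upload (W1 and W2) per round.
            comm_floats += W1.size + W2.size
        update_norm[k] = np.linalg.norm(w1 - W1)      # "best 2-norm" statistic
        counts[k] += 1
        new_W1.append(w1)
        new_W2.append(w2)
    # FedAvg-style aggregation of both model parts to form the new global model.
    W1, W2 = np.mean(new_W1, axis=0), np.mean(new_W2, axis=0)

loss = np.mean([(client_cut_forward(W1, X) @ W2 - y) ** 2 for X, y in data])
print(f"after {ROUNDS} rounds: mean-squared training loss = {loss:.4f}")
print(f"toy traffic counter: {comm_floats:.0f} floats exchanged")
```

The toy traffic counter is only meant to mimic the trade-off the abstract points at: split-training UEs repeatedly exchange smashed data and cut-layer gradients with the BS, whereas FL UEs upload a full local model once per round.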
Pages: 2650-2665
Number of pages: 16
Related Papers
50 records in total
  • [21] FedSL: Federated split learning on distributed sequential data in recurrent neural networks
    Abedi, Ali
    Khan, Shehroz S.
    MULTIMEDIA TOOLS AND APPLICATIONS, 2024, 83 (10) : 28891 - 28911
  • [23] Optimizing Privacy and Latency Tradeoffs in Split Federated Learning Over Wireless Networks
    Lee, Joohyung
    Seif, Mohamed
    Cho, Jungchan
    Poor, H. Vincent
    IEEE WIRELESS COMMUNICATIONS LETTERS, 2024, 13 (12) : 3439 - 3443
  • [24] Latency Minimization for Split Federated Learning
    Guo, Jie
    Xu, Ce
    Ling, Yushi
    Liu, Yuan
    Yu, Qi
    2023 IEEE 98TH VEHICULAR TECHNOLOGY CONFERENCE, VTC2023-FALL, 2023,
  • [25] Federated Deep Reinforcement Learning for the Distributed Control of NextG Wireless Networks
    Tehrani, Peyman
    Restuccia, Francesco
    Levorato, Marco
    2021 IEEE INTERNATIONAL SYMPOSIUM ON DYNAMIC SPECTRUM ACCESS NETWORKS (DYSPAN), 2021, : 248 - 253
  • [26] INTEGRATED DISTRIBUTED WIRELESS SENSING WITH OVER-THE-AIR FEDERATED LEARNING
    Gao, Shijian
    Yan, Jia
    Giannakis, Georgios B.
    IGARSS 2023 - 2023 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM, 2023, : 600 - 603
  • [27] Federated Learning and Wireless Communications
    Qin, Zhijin
    Li, Geoffrey Ye
    Ye, Hao
    IEEE WIRELESS COMMUNICATIONS, 2021, 28 (05) : 134 - 140
  • [28] Efficiently Distributed Federated Learning
    Mittone, Gianluca
    Birke, Robert
    Aldinucci, Marco
    EURO-PAR 2023: PARALLEL PROCESSING WORKSHOPS, PT II, EURO-PAR 2023, 2024, 14352 : 321 - 326
  • [29] TurboFed: A Federated Learning Approach to The PHM of Distributed Wind Turbines
    Chen, Bo
    Zhu, Yongxin
    Guo, Yu
    Xu, Shiyuan
    PROCEEDINGS OF THE 2024 IEEE 10TH IEEE INTERNATIONAL CONFERENCE ON HIGH PERFORMANCE AND SMART COMPUTING, HPSC 2024, 2024, : 105 - 109
  • [30] Convergence Acceleration in Wireless Federated Learning: A Stackelberg Game Approach
    Wang, Kaidi
    Ma, Yi
    Mashhadi, Mahdi Boloursaz
    Foh, Chuan Heng
    Tafazolli, Rahim
    Ding, Zhi
    IEEE TRANSACTIONS ON VEHICULAR TECHNOLOGY, 2025, 74 (01) : 714 - 729