Semi-Federated Learning: Convergence Analysis and Optimization of a Hybrid Learning Framework

Cited: 9
Authors
Zheng, Jingheng [1 ]
Ni, Wanli [1 ]
Tian, Hui [1 ]
Gunduz, Deniz [2 ]
Quek, Tony Q. S. [3 ,4 ]
Han, Zhu [5 ,6 ]
Affiliations
[1] Beijing Univ Posts & Telecommun, State Key Lab Networking & Switching Technol, Beijing 100876, Peoples R China
[2] Imperial Coll London, Dept Elect & Elect Engn, London SW7 2AZ, England
[3] Singapore Univ Technol & Design, Pillar Informat Syst Technol & Design, Singapore 487372, Singapore
[4] Kyung Hee Univ, Dept Elect Engn, Yongin 17104, South Korea
[5] Univ Houston, Dept Elect & Comp Engn, Houston, TX 77004 USA
[6] Kyung Hee Univ, Dept Comp Sci & Engn, Seoul 446701, South Korea
Keywords
Convergence; Computational modeling; Transceivers; Training; NOMA; Data models; Privacy; Semi-federated learning; communication efficiency; convergence analysis; transceiver design; RESOURCE-ALLOCATION; COMMUNICATION-EFFICIENT; MIMO-NOMA; COMPUTATION; MINIMIZATION; DESIGN;
DOI
10.1109/TWC.2023.3270908
Chinese Library Classification (CLC)
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology];
Subject Classification Codes
0808 ; 0809 ;
Abstract
Under the organization of the base station (BS), wireless federated learning (FL) enables collaborative model training among multiple devices. However, the BS merely aggregates local updates during training, which leaves its computational resources wasted. To tackle this issue, we propose a semi-federated learning (SemiFL) paradigm that leverages the computing capabilities of both the BS and the devices for a hybrid implementation of centralized learning (CL) and FL. Specifically, each device sends both local gradients and data samples to the BS for training a shared global model. To improve communication efficiency over the same time-frequency resources, we integrate over-the-air computation for aggregation and non-orthogonal multiple access for transmission by designing a novel transceiver structure. To gain deeper insights, we conduct a convergence analysis by deriving a closed-form optimality gap for SemiFL, and extend the result to two additional cases. In the first case, the BS uses all accumulated data samples to compute the CL gradient, while in the second case a decreasing learning rate is adopted. Our analytical results capture the destructive effect of wireless communication and show that both FL and CL are special cases of SemiFL. We then formulate a non-convex problem to reduce the optimality gap by jointly optimizing the transmit power and receive beamformers, and propose a two-stage algorithm to solve this intractable problem, providing closed-form solutions for the beamformers. Extensive simulation results on two real-world datasets corroborate our theoretical analysis and show that the proposed SemiFL outperforms conventional FL, achieving a 3.2% accuracy gain on the MNIST dataset over state-of-the-art benchmarks.
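The hybrid SemiFL update described in the abstract can be illustrated with a minimal NumPy sketch: devices compute local gradients that are summed by over-the-air aggregation (modeled here simply as superposition plus additive noise), while the BS also computes a centralized gradient on raw samples the devices uploaded, and the two are combined into one global step. This is a toy least-squares illustration of the general idea only, not the paper's actual algorithm; all names, the noise model, and the 50/50 combining weight are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_gradient(w, X, y):
    # Least-squares gradient on one device's data: (1/n) * X^T (X w - y)
    return X.T @ (X @ w - y) / len(y)

# Toy setup: K devices, each with n samples of a d-dimensional linear model
K, d, n = 4, 5, 20
w = np.zeros(d)
devices = [(rng.normal(size=(n, d)), rng.normal(size=n)) for _ in range(K)]

# CL side: each device also uploads a few raw samples to the BS
uploaded = [(X[:5], y[:5]) for X, y in devices]
Xc = np.vstack([X for X, _ in uploaded])
yc = np.concatenate([y for _, y in uploaded])

for step in range(100):
    # FL side: over-the-air aggregation -- transmitted gradients superpose
    # on the multiple-access channel, modeled here as a sum plus AWGN
    noise = rng.normal(scale=0.01, size=d)
    fl_grad = sum(local_gradient(w, X, y) for X, y in devices) / K + noise

    # CL side: BS computes a gradient on the accumulated uploaded samples
    cl_grad = local_gradient(w, Xc, yc)

    # Hybrid SemiFL update: convex combination of the FL and CL gradients
    w -= 0.1 * (0.5 * fl_grad + 0.5 * cl_grad)
```

Setting the combining weight to 1 on either side recovers pure FL or pure CL, which mirrors the abstract's observation that both are special cases of SemiFL.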
Pages: 9438-9456
Number of pages: 19
Related Papers
50 records in total
  • [31] Clustered Vehicular Federated Learning: Process and Optimization
    Taik, Afaf
    Mlika, Zoubeir
    Cherkaoui, Soumaya
    IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2022, 23 (12) : 25371 - 25383
  • [32] FedCD: A Hybrid Federated Learning Framework for Efficient Training With IoT Devices
    Liu, Jianchun
    Huo, Yujia
    Qu, Pengcheng
    Xu, Sun
    Liu, Zhi
    Ma, Qianpiao
    Huang, Jinyang
    IEEE INTERNET OF THINGS JOURNAL, 2024, 11 (11): : 20040 - 20050
  • [33] Deploying Federated Learning in Large-Scale Cellular Networks: Spatial Convergence Analysis
    Lin, Zhenyi
    Li, Xiaoyang
    Lau, Vincent K. N.
    Gong, Yi
    Huang, Kaibin
    IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS, 2022, 21 (03) : 1542 - 1556
  • [34] STAR-RIS Integrated Nonorthogonal Multiple Access and Over-the-Air Federated Learning: Framework, Analysis, and Optimization
    Ni, Wanli
    Liu, Yuanwei
    Eldar, Yonina C.
    Yang, Zhaohui
    Tian, Hui
    IEEE INTERNET OF THINGS JOURNAL, 2022, 9 (18) : 17136 - 17156
  • [35] Global Convergence of Federated Learning for Mixed Regression
    Su, Lili
    Xu, Jiaming
    Yang, Pengkun
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2024, 70 (09) : 6391 - 6411
  • [36] Convergence of Federated Learning Over a Noisy Downlink
    Amiri, Mohammad Mohammadi
    Gunduz, Deniz
    Kulkarni, Sanjeev R.
    Poor, H. Vincent
    IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS, 2022, 21 (03) : 1422 - 1437
  • [37] On the Federated Learning Framework for Cooperative Perception
    Zhang, Zhenrong
    Liu, Jianan
    Zhou, Xi
    Huang, Tao
    Han, Qing-Long
    Liu, Jingxin
    Liu, Hongbin
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2024, 9 (11): : 9423 - 9430
  • [38] A Decentralized Federated Learning Framework via Committee Mechanism With Convergence Guarantee
    Che, Chunjiang
    Li, Xiaoli
    Chen, Chuan
    He, Xiaoyu
    Zheng, Zibin
    IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS, 2022, 33 (12) : 4783 - 4800
  • [39] FedSeq: A Hybrid Federated Learning Framework Based on Sequential In-Cluster Training
    Chen, Zhikun
    Li, Daofeng
    Ni, Rui
    Zhu, Jinkang
    Zhang, Sihai
    IEEE SYSTEMS JOURNAL, 2023, 17 (03): : 4038 - 4049
  • [40] Optimized Federated Multitask Learning in Mobile Edge Networks: A Hybrid Client Selection and Model Aggregation Approach
    Hamood, Moqbel
    Albaseer, Abdullatif
    Abdallah, Mohamed
    Al-Fuqaha, Ala
    Mohamed, Amr
    IEEE TRANSACTIONS ON VEHICULAR TECHNOLOGY, 2024, 73 (11) : 17613 - 17629