Semi-Federated Learning: Convergence Analysis and Optimization of a Hybrid Learning Framework

Cited by: 9
Authors
Zheng, Jingheng [1]
Ni, Wanli [1]
Tian, Hui [1]
Gunduz, Deniz [2]
Quek, Tony Q. S. [3,4]
Han, Zhu [5,6]
Affiliations
[1] Beijing Univ Posts & Telecommun, State Key Lab Networking & Switching Technol, Beijing 100876, Peoples R China
[2] Imperial Coll London, Dept Elect & Elect Engn, London SW7 2AZ, England
[3] Singapore Univ Technol & Design, Pillar Informat Syst Technol & Design, Singapore 487372, Singapore
[4] Kyung Hee Univ, Dept Elect Engn, Yongin 17104, South Korea
[5] Univ Houston, Dept Elect & Comp Engn, Houston, TX 77004 USA
[6] Kyung Hee Univ, Dept Comp Sci & Engn, Seoul 446701, South Korea
Keywords
Convergence; Computational modeling; Transceivers; Training; NOMA; Data models; Privacy; Semi-federated learning; communication efficiency; convergence analysis; transceiver design; RESOURCE-ALLOCATION; COMMUNICATION-EFFICIENT; MIMO-NOMA; COMPUTATION; MINIMIZATION; DESIGN
DOI
10.1109/TWC.2023.3270908
CLC classification
TM [Electrical Engineering]; TN [Electronics & Communication Technology]
Discipline codes
0808; 0809
Abstract
Under the organization of the base station (BS), wireless federated learning (FL) enables collaborative model training among multiple devices. However, the BS merely aggregates local updates during training, which wastes its computational resources. To tackle this issue, we propose a semi-federated learning (SemiFL) paradigm that leverages the computing capabilities of both the BS and the devices for a hybrid implementation of centralized learning (CL) and FL. Specifically, each device sends both local gradients and data samples to the BS for training a shared global model. To improve communication efficiency over the same time-frequency resources, we integrate over-the-air computation for aggregation and non-orthogonal multiple access for transmission by designing a novel transceiver structure. To gain deeper insight, we conduct convergence analysis by deriving a closed-form optimality gap for SemiFL and extend the result to two additional cases: in the first, the BS uses all accumulated data samples to calculate the CL gradient, while in the second a decreasing learning rate is adopted. Our analytical results capture the destructive effect of wireless communication and show that both FL and CL are special cases of SemiFL. We then formulate a non-convex problem to reduce the optimality gap by jointly optimizing the transmit power and receive beamformers, and propose a two-stage algorithm to solve this intractable problem, providing closed-form solutions for the beamformers. Extensive simulation results on two real-world datasets corroborate our theoretical analysis and show that the proposed SemiFL outperforms conventional FL, achieving a 3.2% accuracy gain on the MNIST dataset over state-of-the-art benchmarks.
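The hybrid update described in the abstract (each device contributes both a local gradient and raw samples, and the BS mixes an FL-style aggregated gradient with a CL gradient computed from the offloaded data) can be illustrated on a toy model. The sketch below is purely illustrative and not the paper's algorithm: the scalar linear model, the plain gradient averaging (standing in for over-the-air aggregation over a noiseless channel), and the mixing weight `alpha` are all assumptions introduced here.

```python
# Toy sketch of one SemiFL training round on a scalar linear model
# y ≈ w * x. Each device holds "local" samples (used for its FL
# gradient) and "offload" samples (sent to the BS for the CL gradient).
import random

def grad(w, samples):
    """Gradient of mean squared error (w*x - y)^2 over a sample set."""
    return sum(2 * (w * x - y) * x for x, y in samples) / len(samples)

def semifl_round(w, devices, alpha=0.5, lr=0.05):
    # FL part: average the devices' local gradients (a stand-in for
    # the paper's over-the-air aggregation).
    fl_grad = sum(grad(w, d["local"]) for d in devices) / len(devices)
    # CL part: the BS computes a gradient on the offloaded samples.
    offloaded = [s for d in devices for s in d["offload"]]
    cl_grad = grad(w, offloaded)
    # Hybrid update: mix the two gradient estimates.
    return w - lr * (alpha * fl_grad + (1 - alpha) * cl_grad)

random.seed(0)
TRUE_W = 3.0

def make_samples(n):
    xs = [random.uniform(-1, 1) for _ in range(n)]
    return [(x, TRUE_W * x + random.gauss(0, 0.1)) for x in xs]

devices = [{"local": make_samples(20), "offload": make_samples(5)}
           for _ in range(4)]

w = 0.0
for _ in range(200):
    w = semifl_round(w, devices)
print(round(w, 2))  # close to TRUE_W = 3.0
```

Setting `alpha = 1` recovers pure FL and `alpha = 0` pure CL, mirroring the abstract's observation that both are special cases of SemiFL.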
Pages: 9438-9456
Page count: 19
Related Papers
50 records in total
  • [21] Concentrated Differentially Private Federated Learning With Performance Analysis
    Hu, Rui
    Guo, Yuanxiong
    Gong, Yanmin
    IEEE OPEN JOURNAL OF THE COMPUTER SOCIETY, 2021, 2 : 276 - 289
  • [22] Federated Learning Over Wireless Networks: Convergence Analysis and Resource Allocation
    Dinh, Canh T.
    Tran, Nguyen H.
    Nguyen, Minh N. H.
    Hong, Choong Seon
    Bao, Wei
    Zomaya, Albert Y.
    Gramoli, Vincent
IEEE/ACM TRANSACTIONS ON NETWORKING, 2021, 29 (01) : 398 - 409
  • [23] A Hybrid Federated Learning Architecture With Online Learning and Model Compression
    Odeyomi, Olusola T.
    Ajibuwa, Opeyemi
    Roy, Kaushik
    IEEE ACCESS, 2024, 12 : 191046 - 191058
  • [24] A DAG-Blockchain-Assisted Federated Learning Framework in Wireless Networks: Learning Performance and Throughput Optimization Schemes
    Wang, Qiang
    Xu, Shaoyi
    Xu, Rongtao
    Ai, Bo
    IEEE TRANSACTIONS ON VEHICULAR TECHNOLOGY, 2025, 74 (03) : 5097 - 5113
  • [25] The Role of Communication Time in the Convergence of Federated Edge Learning
    Zhou, Yipeng
    Fu, Yao
    Luo, Zhenxiao
    Hu, Miao
    Wu, Di
    Sheng, Quan Z.
    Yu, Shui
    IEEE TRANSACTIONS ON VEHICULAR TECHNOLOGY, 2022, 71 (03) : 3241 - 3254
  • [26] Hierarchical Federated Learning With Quantization: Convergence Analysis and System Design
    Liu, Lumin
    Zhang, Jun
    Song, Shenghui
    Letaief, Khaled B.
    IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS, 2023, 22 (01) : 2 - 18
  • [27] A Novel Framework for the Analysis and Design of Heterogeneous Federated Learning
    Wang, Jianyu
    Liu, Qinghua
    Liang, Hao
Joshi, Gauri
    Poor, H. Vincent
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2021, 69 : 5234 - 5249
  • [28] On Safeguarding Privacy and Security in the Framework of Federated Learning
    Ma, Chuan
    Li, Jun
    Ding, Ming
    Yang, Howard H.
    Shu, Feng
    Quek, Tony Q. S.
    Poor, H. Vincent
IEEE NETWORK, 2020, 34 (04) : 242 - 248
  • [29] Federated Learning Convergence Optimization for Energy-Limited and Social-Aware Edge Nodes
    Ling, Xiaoling
    Chi, Weicheng
    Zhang, Jinjuan
    Li, Zhonghang
    IEEE ACCESS, 2024, 12 : 107844 - 107854
  • [30] Performance Optimization for Noise Interference Privacy Protection in Federated Learning
    Peng, Zihao
    Li, Boyuan
    Li, Le
    Chen, Shengbo
    Wang, Guanghui
    Rao, Hong
    Shen, Cong
    IEEE TRANSACTIONS ON COGNITIVE COMMUNICATIONS AND NETWORKING, 2023, 9 (05) : 1322 - 1339