Semi-Federated Learning: Convergence Analysis and Optimization of a Hybrid Learning Framework

Cited by: 9
Authors
Zheng, Jingheng [1 ]
Ni, Wanli [1 ]
Tian, Hui [1 ]
Gunduz, Deniz [2 ]
Quek, Tony Q. S. [3 ,4 ]
Han, Zhu [5 ,6 ]
Affiliations
[1] Beijing Univ Posts & Telecommun, State Key Lab Networking & Switching Technol, Beijing 100876, Peoples R China
[2] Imperial Coll London, Dept Elect & Elect Engn, London SW7 2AZ, England
[3] Singapore Univ Technol & Design, Informat Syst Technol & Design Pillar, Singapore 487372, Singapore
[4] Kyung Hee Univ, Dept Elect Engn, Yongin 17104, South Korea
[5] Univ Houston, Dept Elect & Comp Engn, Houston, TX 77004 USA
[6] Kyung Hee Univ, Dept Comp Sci & Engn, Seoul 446701, South Korea
Keywords
Convergence; Computational modeling; Transceivers; Training; NOMA; Data models; Privacy; Semi-federated learning; communication efficiency; convergence analysis; transceiver design; RESOURCE-ALLOCATION; COMMUNICATION-EFFICIENT; MIMO-NOMA; COMPUTATION; MINIMIZATION; DESIGN;
DOI
10.1109/TWC.2023.3270908
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronics and Communication Technology]
Discipline Classification Codes
0808; 0809
Abstract
Under the organization of the base station (BS), wireless federated learning (FL) enables collaborative model training among multiple devices. However, the BS is merely responsible for aggregating local updates during the training process, which leaves the BS's computational resources underutilized. To tackle this issue, we propose a semi-federated learning (SemiFL) paradigm that leverages the computing capabilities of both the BS and the devices for a hybrid implementation of centralized learning (CL) and FL. Specifically, each device sends both local gradients and data samples to the BS for training a shared global model. To improve communication efficiency over the same time-frequency resources, we integrate over-the-air computation for aggregation and non-orthogonal multiple access for transmission by designing a novel transceiver structure. To gain deeper insights, we conduct a convergence analysis by deriving a closed-form optimality gap for SemiFL and extend the result to two additional cases. In the first case, the BS uses all accumulated data samples to calculate the CL gradient, while a decreasing learning rate is adopted in the second case. Our analytical results capture the destructive effect of wireless communication and show that both FL and CL are special cases of SemiFL. Then, we formulate a non-convex problem to reduce the optimality gap by jointly optimizing the transmit power and receive beamformers. Accordingly, we propose a two-stage algorithm to solve this intractable problem, in which we provide closed-form solutions for the beamformers. Extensive simulation results on two real-world datasets corroborate our theoretical analysis and show that the proposed SemiFL outperforms conventional FL and achieves a 3.2% accuracy gain on the MNIST dataset compared to state-of-the-art benchmarks.
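To make the hybrid CL/FL update concrete, the following minimal Python sketch shows one SemiFL-style training round under simplifying assumptions: the over-the-air aggregation of local gradients is modeled as a noisy average, the BS computes a CL gradient on the samples offloaded to it, and the two gradients are combined with equal weights. All names (local_gradient, semifl_round, make_batch) and the equal-weight combination are illustrative assumptions, not the paper's transceiver design or its power/beamformer optimization.

```python
import numpy as np

rng = np.random.default_rng(0)


def local_gradient(w, X, y):
    """Gradient of the least-squares loss 0.5/n * ||X @ w - y||^2 w.r.t. w."""
    return X.T @ (X @ w - y) / len(y)


def semifl_round(w, device_batches, bs_batch, lr=0.1, noise_std=0.01):
    """One illustrative SemiFL round:
    - devices' local gradients are aggregated over the air (additive noise
      stands in for the destructive effect of the wireless channel);
    - the BS computes a CL gradient on the data samples offloaded to it;
    - the global model is updated with a combination of the two."""
    fl_grad = np.mean([local_gradient(w, X, y) for X, y in device_batches], axis=0)
    fl_grad = fl_grad + noise_std * rng.standard_normal(fl_grad.shape)

    X_bs, y_bs = bs_batch
    cl_grad = local_gradient(w, X_bs, y_bs)

    # Equal weighting is an assumption made here for simplicity; dropping either
    # term recovers pure FL or pure CL, mirroring the claim that both are
    # special cases of SemiFL.
    return w - lr * 0.5 * (fl_grad + cl_grad)


# Toy usage: linear regression with 4 devices, each holding a local batch,
# plus a batch of samples the devices have offloaded to the BS.
w_true = rng.standard_normal(5)


def make_batch(n):
    X = rng.standard_normal((n, 5))
    return X, X @ w_true + 0.1 * rng.standard_normal(n)


device_batches = [make_batch(32) for _ in range(4)]
bs_batch = make_batch(64)

w = np.zeros(5)
for _ in range(200):
    w = semifl_round(w, device_batches, bs_batch)
print("estimation error:", float(np.linalg.norm(w - w_true)))
```

In this toy setting, setting noise_std to zero corresponds to ideal (error-free) aggregation, while increasing it illustrates the communication-induced gap that the paper's transceiver optimization is designed to shrink.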
Pages: 9438 - 9456
Number of pages: 19
Related Papers
50 records in total
  • [1] Retransmission-Based Semi-Federated Learning
    Zheng, Jingheng
    Tian, Hui
    Ni, Wanli
    Nie, Gaofeng
    Jiang, Wenchao
    Quek, Tony Q. S.
    IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS, 2024, 23 (12) : 18363 - 18379
  • [2] CONVERGENCE ANALYSIS OF SEMI-FEDERATED LEARNING WITH NON-IID DATA
    Ni, Wanli
    Han, Jiachen
    Qin, Zhijin
    2024 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING WORKSHOPS, ICASSPW 2024, 2024: 214 - 218
  • [3] Semi-Federated Learning
    Chen, Zhikun
    Li, Daofeng
    Zhao, Ming
    Zhang, Sihai
    Zhu, Jinkang
    2020 IEEE WIRELESS COMMUNICATIONS AND NETWORKING CONFERENCE (WCNC), 2020,
  • [4] Semi-Federated Learning for Connected Intelligence With Computing-Heterogeneous Devices
    Han, Jiachen
    Ni, Wanli
    Li, Li
    IEEE INTERNET OF THINGS JOURNAL, 2024, 11 (21) : 34078 - 34092
  • [5] Convergence Analysis and Latency Minimization for Retransmission-Based Semi-Federated Learning
    Zheng, Jingheng
    Ni, Wanli
    Tian, Hui
    Jiang, Wenchao
    Quek, Tony Q. S.
    IEEE CONFERENCE ON GLOBAL COMMUNICATIONS, GLOBECOM, 2023: 2057 - 2062
  • [6] Convergence Analysis and Latency Minimization for Semi-Federated Learning in Massive IoT Networks
    Ren, Jianyang
    Ni, Wanli
    Tian, Hui
    Nie, Gaofeng
    IEEE TRANSACTIONS ON GREEN COMMUNICATIONS AND NETWORKING, 2024, 8 (01) : 413 - 426
  • [7] Semi-Federated Learning: An Integrated Framework for Pervasive Intelligence in 6G Networks
    Zheng, Jingheng
    Ni, Wanli
    Tian, Hui
    Gunduz, Deniz
    Quek, Tony Q. S.
    IEEE INFOCOM 2022 - IEEE CONFERENCE ON COMPUTER COMMUNICATIONS WORKSHOPS (INFOCOM WKSHPS), 2022,
  • [8] Digital Twin-Assisted Semi-Federated Learning Framework for Industrial Edge Intelligence
    Wu, Xiongyue
    Tang, Jianhua
    Siew, Marie
    CHINA COMMUNICATIONS, 2024, 21 (05) : 314 - 329
  • [9] Accelerating Hybrid Federated Learning Convergence Under Partial Participation
    Bian, Jieming
    Wang, Lei
    Yang, Kun
    Shen, Cong
    Xu, Jie
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2024, 72 : 3258 - 3271
  • [10] Differentially Private Federated Learning on Non-iid Data: Convergence Analysis and Adaptive Optimization
    Chen, Lin
    Ding, Xiaofeng
    Bao, Zhifeng
    Zhou, Pan
    Jin, Hai
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2024, 36 (09) : 4567 - 4581