SplitFed: When Federated Learning Meets Split Learning

Cited by: 0
Authors
Thapa, Chandra [1 ]
Arachchige, Pathum Chamikara Mahawaga [1 ]
Camtepe, Seyit [1 ]
Sun, Lichao [2 ]
Affiliations
[1] CSIRO Data61, Sydney, NSW, Australia
[2] Lehigh Univ, Bethlehem, PA 18015 USA
Keywords
DOI
None available
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Federated learning (FL) and split learning (SL) are two popular distributed machine learning approaches. Both follow a model-to-data scenario: clients train and test machine learning models without sharing raw data. SL provides better model privacy than FL because the model architecture is split between the clients and the server. Moreover, the split model makes SL a better option for resource-constrained environments. However, SL trains more slowly than FL because of its relay-based training across multiple clients. To address this, this paper presents a novel approach, named splitfed learning (SFL), that amalgamates the two approaches, eliminating their inherent drawbacks, along with a refined architectural configuration incorporating differential privacy and PixelDP to enhance data privacy and model robustness. Our analysis and empirical results demonstrate that (pure) SFL provides test accuracy and communication efficiency comparable to SL while significantly reducing the computation time per global epoch for multiple clients. Furthermore, as in SL, its communication efficiency over FL improves with the number of clients. Finally, the performance of SFL with privacy and robustness measures is evaluated under extended experimental settings.
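The workflow the abstract describes, where every client trains its portion of the model together with a main server and a separate fed server averages the client-side weights, can be made concrete in code. Below is a minimal PyTorch sketch of one SFL global epoch under stated assumptions: a toy MLP split at an arbitrary cut layer, equally sized client datasets, a single shared server-side model updated sequentially (the paper processes clients in parallel), and no privacy measures. ClientNet, ServerNet, and sfl_round are illustrative names, not the paper's released implementation.

```python
# A minimal sketch of one splitfed learning (SFL) global epoch.
# Assumptions: toy MLP split at an arbitrary cut layer; the network transfer
# of smashed data and gradients is simulated in-process; privacy measures
# (DP noise on smashed data, PixelDP) are omitted.
import copy
import torch
import torch.nn as nn

class ClientNet(nn.Module):
    """Client-side model portion, up to the cut layer."""
    def __init__(self):
        super().__init__()
        self.layers = nn.Sequential(nn.Linear(784, 128), nn.ReLU())
    def forward(self, x):
        return self.layers(x)

class ServerNet(nn.Module):
    """Server-side model portion, from the cut layer to the output."""
    def __init__(self):
        super().__init__()
        self.layers = nn.Sequential(nn.Linear(128, 64), nn.ReLU(),
                                    nn.Linear(64, 10))
    def forward(self, x):
        return self.layers(x)

def sfl_round(global_client_net, server_net, client_loaders, lr=0.01):
    """One global epoch: clients train alongside the main server, then the
    fed server averages the client-side weights (FedAvg)."""
    loss_fn = nn.CrossEntropyLoss()
    server_opt = torch.optim.SGD(server_net.parameters(), lr=lr)
    client_states = []

    for loader in client_loaders:
        # Each client starts from the current global client-side model.
        client_net = copy.deepcopy(global_client_net)
        client_opt = torch.optim.SGD(client_net.parameters(), lr=lr)

        for x, y in loader:
            client_opt.zero_grad()
            server_opt.zero_grad()
            # Client-side forward pass produces the "smashed data".
            smashed = client_net(x)
            # In deployment the smashed data crosses the network; the
            # detach/requires_grad pair stands in for that transfer.
            smashed_srv = smashed.detach().requires_grad_(True)
            # Main server completes the forward and backward passes.
            loss = loss_fn(server_net(smashed_srv), y)
            loss.backward()
            server_opt.step()
            # The cut-layer gradient returns to the client, which then
            # backpropagates through its own portion of the model.
            smashed.backward(smashed_srv.grad)
            client_opt.step()

        client_states.append(client_net.state_dict())

    # Fed server: unweighted FedAvg (assumes equal client dataset sizes).
    avg_state = {k: torch.stack([s[k] for s in client_states]).mean(dim=0)
                 for k in client_states[0]}
    global_client_net.load_state_dict(avg_state)
```

In the paper's privacy-enhanced variant, calibrated noise would be added to the smashed data before it leaves the client, and PixelDP would harden the model against adversarial examples; both attach naturally at the transfer step above.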
Pages: 8485-8493
Page count: 9
Related Papers
50 records in total
  • [1] Pu, Ruizhi; Yu, Lixing; Xu, Gezheng; Zhou, Fan; Ling, Charles X.; Wang, Boyu. FedELR: When federated learning meets learning with noisy labels. NEURAL NETWORKS, 2025, 187.
  • [2] Jing, Rui; Chen, Wei; Wu, Xiaoxin; Wang, Zehua; Tian, Zijian; Zhang, Fan. When Blockchain Meets Asynchronous Federated Learning. ADVANCED INTELLIGENT COMPUTING TECHNOLOGY AND APPLICATIONS, PT IX, ICIC 2024, 2024, 14870: 199-207.
  • [3] Pang, Xiaoyi; Hu, Jiahui; Sun, Peng; Ren, Ju; Wang, Zhibo. When Federated Learning Meets Knowledge Distillation. IEEE WIRELESS COMMUNICATIONS, 2024, 31 (05): 208-214.
  • [4] Gao, Hongchang; Thai, My T.; Wu, Jie. When Decentralized Optimization Meets Federated Learning. IEEE NETWORK, 2023, 37 (05): 233-239.
  • [5] Ma, Chuan; Li, Jun; Shi, Long; Ding, Ming; Wang, Taotao; Han, Zhu; Poor, H. Vincent. When Federated Learning Meets Blockchain: A New Distributed Learning Paradigm. IEEE COMPUTATIONAL INTELLIGENCE MAGAZINE, 2022, 17 (03): 26-33.
  • [6] Feng, Chenyuan; Yang, Howard H.; Wang, Siye; Zhao, Zhongyuan; Quek, Tony Q. S. Hybrid Learning: When Centralized Learning Meets Federated Learning in the Mobile Edge Computing Systems. IEEE TRANSACTIONS ON COMMUNICATIONS, 2023, 71 (12): 7008-7022.
  • [7] Qiu, Pengyu; Pu, Yuwen; Liu, Yongchao; Liu, Wenyan; Yue, Yun; Zhu, Xiaowei; Li, Lichun; Li, Jinbao; Ji, Shouling. Integer Is Enough: When Vertical Federated Learning Meets Rounding. THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 13, 2024: 14704-14712.
  • [8] Chen, Jingxue; Yan, Hang; Liu, Zhiyuan; Zhang, Min; Xiong, Hu; Yu, Shui. When Federated Learning Meets Privacy-Preserving Computation. ACM COMPUTING SURVEYS, 2024, 56 (12).
  • [9] Tian, Yuanyishu; Wan, Yao; Lyu, Lingjuan; Yao, Dezhong; Jin, Hai; Sun, Lichao. FedBERT: When Federated Learning Meets Pre-training. ACM TRANSACTIONS ON INTELLIGENT SYSTEMS AND TECHNOLOGY, 2022, 13 (04).
  • [10] Ren, Xuebin; Yang, Shusen; Zhao, Cong; McCann, Julie; Xu, Zongben. Belt and Braces: When Federated Learning Meets Differential Privacy. COMMUNICATIONS OF THE ACM, 2024, 67 (12): 66-77.