Cooperative Multi-Model Training for Personalized Federated Learning Over Heterogeneous Devices

Cited by: 0
Authors
Xu, Jian [1 ]
Wan, Shuo [2 ]
Li, Yinchuan [2 ]
Luo, Sichun [3 ]
Chen, Zhilin [2 ]
Shao, Yunfeng [2 ]
Chen, Zhitang [2 ]
Huang, Shao-Lun [1 ]
Song, Linqi [3 ]
Affiliations
[1] Tsinghua Univ, Tsinghua Shenzhen Int Grad Sch, Beijing 100190, Peoples R China
[2] Huawei Technol, Noah's Ark Lab, Beijing 100085, Peoples R China
[3] City Univ Hong Kong, Shenzhen Res Inst, Hong Kong, Peoples R China
Funding
National Key R&D Program of China; National Natural Science Foundation of China
Keywords
Training; Performance evaluation; Data privacy; Computer vision; Federated learning; Computational modeling; Computer architecture; Benchmark testing; Feature extraction; Data models; heterogeneous devices; multiple models; label shift; personalization;
DOI
10.1109/JSTSP.2024.3497660
CLC Classification
TM [Electrical technology]; TN [Electronic technology, communication technology]
Discipline Codes
0808; 0809
Abstract
Federated learning (FL) is an increasingly popular paradigm for protecting data privacy in machine learning systems. However, data heterogeneity and high computation cost/latency remain major barriers to deploying FL in real-world applications with heterogeneous devices. In this paper, we propose a novel personalized FL framework named CompFL that allows cooperative training of models with varied structures to mitigate these issues. First, CompFL initializes a set of expert models of varied sizes and allows each client to choose one or more expert models for training according to its capacity. Second, CompFL combines a model-decoupling strategy with local-global feature alignment to mitigate the adverse impact of label heterogeneity: clients share only the feature-extractor part of each model architecture. Third, to encourage mutual enhancement among the various models, knowledge distillation is applied during local training to improve overall performance. To make the framework workable in real systems, we implement it both in centralized settings with server-coordinated parallel training and in decentralized settings with newly developed device-to-device training-forwarding schemes. Extensive experiments on benchmark datasets verify the potential of our framework for personalized FL over heterogeneous devices.
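The knowledge-distillation step mentioned in the abstract can be illustrated with a minimal NumPy sketch of a standard distillation objective: cross-entropy on the true labels plus a temperature-softened KL term toward a peer ("teacher") model's outputs. The function names, the mixing weight `alpha`, and the temperature `T` are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax, numerically stabilized."""
    z = np.asarray(logits, dtype=float) / T
    z -= z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, alpha=0.5, T=2.0):
    """(1 - alpha) * CE(student, labels) + alpha * T^2 * KL(teacher || student).

    The T^2 factor keeps the KL gradient magnitude comparable to the
    cross-entropy term, as is conventional in distillation.
    """
    p_s = softmax(student_logits)
    ce = -np.mean(np.log(p_s[np.arange(len(labels)), labels] + 1e-12))
    p_t = softmax(teacher_logits, T)      # softened teacher targets
    p_sT = softmax(student_logits, T)     # softened student predictions
    kl = np.mean(np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_sT + 1e-12)),
                        axis=-1)) * T * T
    return (1 - alpha) * ce + alpha * kl
```

When the teacher and student agree exactly, the KL term vanishes and only the scaled cross-entropy remains; a disagreeing teacher strictly increases the loss, pulling the student toward the peer model's predictions.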
Pages: 195-207
Page count: 13