Cooperative Multi-Model Training for Personalized Federated Learning Over Heterogeneous Devices

Cited by: 0
Authors
Xu, Jian [1 ]
Wan, Shuo [2 ]
Li, Yinchuan [2 ]
Luo, Sichun [3 ]
Chen, Zhilin [2 ]
Shao, Yunfeng [2 ]
Chen, Zhitang [2 ]
Huang, Shao-Lun [1 ]
Song, Linqi [3 ]
Affiliations
[1] Tsinghua Univ, Tsinghua Shenzhen Int Grad Sch, Beijing 100190, Peoples R China
[2] Huawei Technol, Noahs Ark Lab, Beijing 100085, Peoples R China
[3] City Univ Hong Kong, Shenzhen Res Inst, Hong Kong, Peoples R China
Funding
National Key R&D Program of China; National Natural Science Foundation of China;
Keywords
Training; Performance evaluation; Data privacy; Computer vision; Federated learning; Computational modeling; Computer architecture; Benchmark testing; Feature extraction; Data models; heterogeneous devices; multiple models; label shift; personalization;
DOI
10.1109/JSTSP.2024.3497660
Chinese Library Classification (CLC)
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology];
Discipline Classification Codes
0808 ; 0809 ;
Abstract
Federated learning (FL) is an increasingly popular paradigm for protecting data privacy in machine learning systems. However, data heterogeneity and high computation cost/latency remain major barriers to deploying FL in real-world applications with heterogeneous devices. In this paper, we propose a novel personalized FL framework named CompFL that allows cooperative training of models with varied structures to mitigate these issues. First, CompFL initializes a set of expert models of varied sizes and allows each client to choose one or more expert models for training according to its capacity. Second, CompFL combines a model decoupling strategy with local-global feature alignment to mitigate the adverse impact of label heterogeneity, where clients share only the feature-extractor part of each model architecture. Third, to encourage mutual enhancement among the various models, knowledge distillation is further applied during local training to improve overall performance. To make our framework workable in real systems, we implement it in both centralized settings with server-coordinated parallel training and decentralized settings with newly developed device-to-device training-forwarding schemes. Extensive experiments on benchmark datasets verify the potential of our framework for personalized FL over heterogeneous devices.
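The model decoupling strategy described in the abstract, where clients keep classifier heads local and share only feature extractors grouped by architecture, can be sketched as a per-architecture, sample-weighted parameter average. This is a minimal illustration under assumed names (`aggregate_extractors`, scalar stand-ins for weight tensors), not the authors' implementation:

```python
from collections import defaultdict

def aggregate_extractors(client_updates):
    """Average feature-extractor weights separately per architecture.

    client_updates: list of (arch_id, n_samples, extractor_weights),
    where extractor_weights is a dict {param_name: float} standing in
    for real weight tensors. Classifier heads stay on-device and never
    appear in the updates, mirroring the decoupling strategy.
    """
    weighted_sums = defaultdict(lambda: defaultdict(float))
    sample_counts = defaultdict(int)
    for arch_id, n, weights in client_updates:
        sample_counts[arch_id] += n
        for name, w in weights.items():
            weighted_sums[arch_id][name] += n * w
    # Normalize each architecture's sum by its total sample count.
    return {
        arch: {name: s / sample_counts[arch] for name, s in params.items()}
        for arch, params in weighted_sums.items()
    }

# Two clients trained the "small" expert, one trained the "large" expert.
updates = [
    ("small", 100, {"conv1": 1.0}),
    ("small", 300, {"conv1": 2.0}),
    ("large", 50,  {"conv1": 0.5}),
]
agg = aggregate_extractors(updates)
# sample-weighted means: "small" -> 1.75, "large" -> 0.5
```

Because averaging is done per architecture, a client running multiple expert models simply contributes one update per model it trained, and no cross-architecture weight mixing is needed.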
Pages: 195-207
Page count: 13
Related Papers
50 results in total
  • [1] Robust Multi-model Personalized Federated Learning via Model Distillation
    Muhammad, Adil
    Lin, Kai
    Gao, Jian
    Chen, Bincai
    ALGORITHMS AND ARCHITECTURES FOR PARALLEL PROCESSING, ICA3PP 2021, PT III, 2022, 13157 : 432 - 446
  • [2] An Efficient Multi-Model Training Algorithm for Federated Learning
    Li, Cong
    Li, Chunxi
    Zhao, Yongxiang
    Zhang, Baoxian
    Li, Cheng
    2021 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM), 2021,
  • [3] Evolutionary Multi-model Federated Learning on Malicious and Heterogeneous Data
    Shang, Chikai
    Gu, Fangqing
    Jiang, Jiaqi
    2023 23RD IEEE INTERNATIONAL CONFERENCE ON DATA MINING WORKSHOPS, ICDMW 2023, 2023, : 386 - 395
  • [4] Uplink Over-the-Air Aggregation for Multi-Model Wireless Federated Learning
    Zhang, Chong
    Dong, Min
    Liang, Ben
    Afanat, Ali
    Ahmed, Yahia
    2024 IEEE 25TH INTERNATIONAL WORKSHOP ON SIGNAL PROCESSING ADVANCES IN WIRELESS COMMUNICATIONS, SPAWC 2024, 2024, : 36 - 40
  • [5] Towards Personalized Federated Learning via Heterogeneous Model Reassembly
    Wang, Jiaqi
    Yang, Xingyi
    Cui, Suhan
    Che, Liwei
    Lyu, Lingjuan
    Xu, Dongkuan
    Ma, Fenglong
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [6] Asynchronous Multi-Model Dynamic Federated Learning Over Wireless Networks: Theory, Modeling, and Optimization
    Chang, Zhan-Lun
    Hosseinalipour, Seyyedali
    Chiang, Mung
    Brinton, Christopher G.
    IEEE TRANSACTIONS ON COGNITIVE COMMUNICATIONS AND NETWORKING, 2024, 10 (05) : 1989 - 2004
  • [7] Toward Cooperative Federated Learning Over Heterogeneous Edge/Fog Networks
    Wang, Su
    Hosseinalipour, Seyyedali
    Aggarwal, Vaneet
    Brinton, Christopher G.
    Love, David J.
    Su, Weifeng
    Chiang, Mung
    IEEE COMMUNICATIONS MAGAZINE, 2023, 61 (12) : 54 - 60
  • [8] MULTI-MODEL FEDERATED LEARNING OPTIMIZATION BASED ON MULTI-AGENT REINFORCEMENT LEARNING
    Atapour, S. Kaveh
    Seyedmohammadi, S. Jamal
    Sheikholeslami, S. Mohammad
    Abouei, Jamshid
    Mohammadi, Arash
    Plataniotis, Konstantinos N.
    2023 IEEE 9TH INTERNATIONAL WORKSHOP ON COMPUTATIONAL ADVANCES IN MULTI-SENSOR ADAPTIVE PROCESSING, CAMSAP, 2023, : 151 - 155
  • [9] Accelerating Wireless Federated Learning With Adaptive Scheduling Over Heterogeneous Devices
    Li, Yixuan
    Qin, Xiaoqi
    Han, Kaifeng
    Ma, Nan
    Xu, Xiaodong
    Zhang, Ping
    IEEE INTERNET OF THINGS JOURNAL, 2024, 11 (02) : 2286 - 2302
  • [10] Joint Participant Selection and Learning Scheduling for Multi-Model Federated Edge Learning
    Wei, Xinliang
    Liu, Jiyao
    Wang, Yu
    2022 IEEE 19TH INTERNATIONAL CONFERENCE ON MOBILE AD HOC AND SMART SYSTEMS (MASS 2022), 2022, : 537 - 545