Bilateral Improvement in Local Personalization and Global Generalization in Federated Learning

Cited: 0
Authors
Wang, Yansong [1 ]
Xu, Hui [1 ,2 ]
Ali, Waqar [2 ]
Zhou, Xiangmin [3 ]
Shao, Jie
Affiliations
[1] Univ Elect Sci & Technol China, Shenzhen Inst Adv Study, Shenzhen 518110, Peoples R China
[2] Sichuan Artificial Intelligence Res Inst, Yibin 644000, Peoples R China
[3] RMIT Univ, Sch Comp Technol, Melbourne, VIC 3000, Australia
Source
IEEE INTERNET OF THINGS JOURNAL | 2024, Vol. 11, No. 16
Funding
National Natural Science Foundation of China
Keywords
Training; Servers; Data models; Federated learning; Adaptation models; Internet of Things; Synchronization; Cosine similarity; federated learning (FL); fine-tuning; personalized FL (PFL)
DOI
10.1109/JIOT.2024.3399074
CLC Classification
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Federated learning (FL) is a machine learning paradigm in which a server trains a global model by amalgamating contributions from multiple clients, without directly accessing personal client data. Personalized FL (PFL), a specific subset of this domain, shifts the focus from a single global model to providing a personalized model for each client. This difference in training objectives means that while conventional FL aims for optimal generalization at the server level, PFL focuses on client-side model personalization. Achieving both generalization and personalization in one model is often challenging. In response, we introduce FedCACS, a classifier-aggregation-with-cosine-similarity FL method that bridges the gap between conventional FL and PFL. On the one hand, FedCACS adopts cosine similarity and a new PFL training strategy, which enhances the personalization ability of the local model on each client and enables the model to learn more compact image representations. On the other hand, FedCACS uses a classifier aggregation module to aggregate the personalized classifiers from all clients and restore the generalization ability of the global model. Experiments on public datasets affirm the effectiveness of FedCACS in terms of personalization, generalization, and fast adaptation.
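The abstract names two ingredients but does not spell out their implementation: a cosine-similarity-based classifier trained on each client, and a server-side module that aggregates the clients' personalized classifiers. The sketch below is only a minimal illustration of those two ideas under simplifying assumptions (a learnable per-class weight matrix scored by cosine similarity with a temperature, and uniform averaging of client classifier weights); the names CosineClassifier, aggregate_classifiers, and the scale parameter are hypothetical and not taken from the paper.

```python
# Illustrative sketch only -- NOT the paper's implementation of FedCACS.
# Client side: a cosine-similarity classifier head.
# Server side: uniform averaging of the clients' personalized classifier weights.
import torch
import torch.nn as nn
import torch.nn.functional as F


class CosineClassifier(nn.Module):
    """Scores a feature vector by cosine similarity to per-class weight vectors."""

    def __init__(self, feat_dim: int, num_classes: int, scale: float = 16.0):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(num_classes, feat_dim))
        self.scale = scale  # temperature for the logits; a hypothetical default

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        # L2-normalize features and class weights; their dot product is then the
        # cosine similarity, scaled before feeding into cross-entropy.
        f = F.normalize(features, dim=1)
        w = F.normalize(self.weight, dim=1)
        return self.scale * f @ w.t()


def aggregate_classifiers(client_classifiers):
    """Server side: average the personalized classifier weights from all clients
    into a single global classifier (equal client weighting assumed)."""
    stacked = torch.stack([c.weight.data for c in client_classifiers])
    return stacked.mean(dim=0)


if __name__ == "__main__":
    # Toy round: 3 clients, 64-dim features, 10 classes.
    clients = [CosineClassifier(64, 10) for _ in range(3)]
    feats = torch.randn(8, 64)
    logits = clients[0](feats)           # (8, 10) cosine logits for one client
    global_w = aggregate_classifiers(clients)
    print(logits.shape, global_w.shape)  # torch.Size([8, 10]) torch.Size([10, 64])
```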
Pages: 27099-27111
Page count: 13
Related Papers (50 total)
  • [31] Pu, Juncheng; Fu, Xiaodong; Dong, Hai; Zhang, Pengcheng; Liu, Li. Dynamic Adaptive Federated Learning on Local Long-Tailed Data. IEEE TRANSACTIONS ON SERVICES COMPUTING, 2024, 17 (06): 3485-3498.
  • [32] Zhao, Lei; Cai, Lin; Lu, Wu-Sheng. Tailored Federated Learning With Adaptive Central Acceleration on Diversified Global Models. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024.
  • [33] Han, Dong-Jun; Kim, Do-Yeon; Choi, Minseok; Nickel, David; Moon, Jaekyun; Chiang, Mung; Brinton, Christopher G. Federated Split Learning With Joint Personalization-Generalization for Inference-Stage Optimization in Wireless Edge Networks. IEEE TRANSACTIONS ON MOBILE COMPUTING, 2024, 23 (06): 7048-7065.
  • [34] Huang, Ning; Dai, Minghui; Wu, Yuan; Quek, Tony Q. S.; Shen, Xuemin. Wireless Federated Learning With Hybrid Local and Centralized Training: A Latency Minimization Design. IEEE JOURNAL OF SELECTED TOPICS IN SIGNAL PROCESSING, 2023, 17 (01): 248-263.
  • [35] Wang, Zhiyuan; Xu, Hongli; Xu, Yang; Jiang, Zhida; Liu, Jianchun; Chen, Suo. FAST: Enhancing Federated Learning Through Adaptive Data Sampling and Local Training. IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS, 2024, 35 (02): 221-236.
  • [36] Zhang, Haiyan; Li, Xinghua; Xu, Mengfan; Liu, Ximeng; Wu, Tong; Weng, Jian; Deng, Robert H. BADFL: Backdoor Attack Defense in Federated Learning From Local Model Perspective. IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2024, 36 (11): 5661-5674.
  • [37] Kasyap, Harsh; Tripathy, Somanath. Sine: Similarity is Not Enough for Mitigating Local Model Poisoning Attacks in Federated Learning. IEEE TRANSACTIONS ON DEPENDABLE AND SECURE COMPUTING, 2024, 21 (05): 4481-4494.
  • [38] Duan, Moming; Liu, Duo; Chen, Xianzhang; Liu, Renping; Tan, Yujuan; Liang, Liang. Self-Balancing Federated Learning With Global Imbalanced Data in Mobile Systems. IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS, 2021, 32 (01): 59-71.
  • [39] Shin, Young Ah; Noh, Geontae; Jeong, Ik Rae; Chun, Ji Young. Securing a Local Training Dataset Size in Federated Learning. IEEE ACCESS, 2022, 10: 104135-104143.
  • [40] Stergiou, Konstantinos D.; Psannis, Konstantinos E. Federated Learning Approach Decouples Clients From Training a Local Model and With the Communication With the Server. IEEE TRANSACTIONS ON NETWORK AND SERVICE MANAGEMENT, 2022, 19 (04): 4213-4218.