Personalized federated learning with global information fusion and local knowledge inheritance collaboration

Cited by: 0
Authors
Li, Hongjiao [1 ]
Xu, Jiayi [1 ]
Jin, Ming [1 ]
Yin, Anyang [1 ]
Affiliations
[1] Shanghai Univ Elect Power, Dept Comp Sci & Engn, 1851 Hucheng Ring Rd, Shanghai 200120, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Federated learning; Personalized; Meta-learning; Knowledge distillation;
DOI
10.1007/s11227-024-06529-4
CLC Classification Number
TP3 [Computing Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
Traditional federated learning has shown mediocre performance on heterogeneous data, sparking increasing interest in personalized federated learning. Unlike traditional federated learning, which trains a single global consensus model, personalized federated learning provides distinct models to different clients. However, existing federated learning algorithms optimize unidirectionally, at either the server or the client side, leading to a dilemma: "Should we prioritize the learned model's generic performance or its personalized performance?" In this paper, we demonstrate the feasibility of addressing both aspects simultaneously. Concretely, we propose a novel dual-duty framework. On the client side, personalized models retain local knowledge while integrating global information, minimizing the risk associated with each client's local experience. On the server side, virtual sample generation approximates second-order gradients, embedding local class structures into the global model to enhance its generalization capability. Using this dual optimization framework, termed FedCo, we achieve global generality and personalized performance in parallel. Finally, theoretical analysis and extensive experiments validate that FedCo surpasses previous solutions, achieving state-of-the-art general and personalized performance across a variety of heterogeneous data scenarios.
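The client-side idea in the abstract (retain local knowledge while integrating global information) is commonly realized by adding a distillation term that pulls the personalized model's predictions toward the frozen global model's predictions on local data. The sketch below is an illustrative NumPy implementation of that generic pattern for softmax regression, not the authors' exact FedCo procedure; the function name `client_update` and the weighting parameter `lam` are assumptions for illustration.

```python
import numpy as np

def softmax(z):
    """Row-wise softmax with the usual max-shift for numerical stability."""
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def client_update(W_personal, W_global, X, y, lam=0.5, lr=0.1, steps=50):
    """Personalized client step: minimize local cross-entropy plus a
    distillation term toward the (fixed) global model's soft predictions.

    W_personal, W_global : (d, k) weight matrices
    X : (n, d) local features;  y : (n,) integer labels in [0, k)
    lam : weight on the global-distillation term
    """
    n, k = X.shape[0], W_personal.shape[1]
    Y = np.eye(k)[y]                    # one-hot local labels
    P_glob = softmax(X @ W_global)      # global "teacher" outputs, held fixed
    W = W_personal.copy()
    for _ in range(steps):
        P = softmax(X @ W)
        # Gradient of CE w.r.t. logits is (P - target); the two targets are
        # the local labels Y and the global predictions P_glob.
        grad = X.T @ ((P - Y) + lam * (P - P_glob)) / n
        W -= lr * grad
    return W
```

With `lam = 0` this reduces to plain local training; larger `lam` keeps the personalized model closer to the global consensus, which is the trade-off the paper's dual-duty framework is designed to balance.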
Pages: 31
Related Papers
(50 records total)
[31] Geng, Huan; Deng, Dongshang; Zhang, Weidong; Ji, Ping; Wu, Xuangou. Personalized Federated Learning Based on Bidirectional Knowledge Distillation for WiFi Gesture Recognition. ELECTRONICS, 2023, 12(24).
[32] Li, Hanxi; Chen, Guorong; Wang, Bin; Chen, Zheng; Zhu, Yongsheng; Hu, Fuqiang; Dai, Jiao; Wang, Wei. PFedKD: Personalized Federated Learning via Knowledge Distillation Using Unlabeled Pseudo Data for Internet of Things. IEEE INTERNET OF THINGS JOURNAL, 2025, 12(11): 16314-16324.
[33] Wu, Xia; Xu, Lei; Zhu, Liehuang. Local Differential Privacy-Based Federated Learning under Personalized Settings. APPLIED SCIENCES-BASEL, 2023, 13(07).
[34] Cai, Hongyun; Zhang, Meiling; Wang, Shiyun; Zhao, Ao; Zhang, Yu. PLFa-FL: Personalized Local Differential Privacy for Fair Federated Learning. PROCEEDINGS OF THE 2024 27TH INTERNATIONAL CONFERENCE ON COMPUTER SUPPORTED COOPERATIVE WORK IN DESIGN, CSCWD 2024, 2024: 2325-2332.
[35] Chen, Shuxiao; Zheng, Qinqing; Long, Qi; Su, Weijie J. Minimax Estimation for Personalized Federated Learning: An Alternative between FedAvg and Local Training? JOURNAL OF MACHINE LEARNING RESEARCH, 2023, 24.
[36] Pan, Zibin; Li, Chi; Yu, Fangchen; Wang, Shuyi; Tang, Xiaoying; Zhao, Junhua. Balancing the trade-off between global and personalized performance in federated learning. INFORMATION SCIENCES, 2025, 712.
[37] Wang, Yansong; Xu, Hui; Ali, Waqar; Zhou, Xiangmin; Shao, Jie. Bilateral Improvement in Local Personalization and Global Generalization in Federated Learning. IEEE INTERNET OF THINGS JOURNAL, 2024, 11(16): 27099-27111.
[38] Zhang, Jianqing; Hua, Yang; Wang, Hao; Song, Tao; Xue, Zhengui; Ma, Ruhui; Guan, Haibing. FedCP: Separating Feature Information for Personalized Federated Learning via Conditional Policy. PROCEEDINGS OF THE 29TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2023, 2023: 3249-3261.
[39] Zhang, Shining; Wang, Xingwei; Zeng, Rongfei; Zeng, Chao; Li, Ying; Huang, Min. A personalized federated cloud-edge collaboration framework via cross-client knowledge distillation. FUTURE GENERATION COMPUTER SYSTEMS-THE INTERNATIONAL JOURNAL OF ESCIENCE, 2025, 165.
[40] Wang, Debao; Guan, Shaopeng; Sun, Ruikang. A novel staged training strategy leveraging knowledge distillation and model fusion for heterogeneous federated learning. JOURNAL OF NETWORK AND COMPUTER APPLICATIONS, 2025, 236.