Personalized Federated Learning with Progressive Local Training Strategy and Lightweight Classifier

Times Cited: 0
Authors
Liu, Jianhao [1 ,2 ]
Gong, Wenjuan [1 ,2 ]
Fang, Ziyi [1 ]
Gonzalez, Jordi [3 ]
Rodrigues, Joel [4 ]
Affiliations
[1] China Univ Petr East China, Qingdao Inst Software, Coll Comp Sci & Technol, Qingdao 266580, Peoples R China
[2] Shandong Key Lab Intelligent Oil & Gas Ind Softwar, Qingdao 266580, Peoples R China
[3] Univ Autonoma Barcelona, Comp Vis Ctr, Barcelona 08193, Spain
[4] Amazonas State Univ, Higher Sch Technol, BR-69000 Manaus, Brazil
Source
APPLIED SCIENCES-BASEL | 2025, Vol. 15, Issue 5
Funding
National Natural Science Foundation of China;
Keywords
federated learning; personalized federated learning; data heterogeneity; catastrophic forgetting;
DOI
10.3390/app15052481
CLC Number
O6 [Chemistry];
Discipline Code
0703;
Abstract
Data heterogeneity poses a significant challenge in federated learning (FL), which has become a central focus of contemporary research in artificial intelligence. Personalized federated learning (pFL), a specialized branch of FL, seeks to address this issue by tailoring models to the unique data distributions of individual clients. Despite its potential, current pFL frameworks face critical limitations, particularly in handling client training discontinuity. When clients are unable to engage in every training round, the resulting models tend to diverge from their local knowledge, leading to catastrophic forgetting. Moreover, existing frameworks often separate the model from the local classifier used for personalization, keeping the classifier local for extended periods. This inherent characteristic of classifiers frequently leads to overfitting on local training data, thereby impairing the generalization capability of the local models. To tackle these challenges, we propose a novel personalized federated learning framework, PFPS-LWC (Personalized Federated Learning with a Progressive Local Training Strategy and a Lightweight Classifier). Our approach introduces local knowledge recall and employs regularized classifiers to mitigate the effects of local knowledge forgetting and enhance the generalization of the models. We evaluated the performance of PFPS-LWC under varying degrees of data heterogeneity using the CIFAR10 and CIFAR100 datasets. Our method outperformed the state-of-the-art approach by up to 4.22% and consistently achieved the best performance across various heterogeneous environments, further demonstrating its effectiveness and robustness.
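The abstract's core idea of regularizing local training so the personalized classifier does not drift away from shared knowledge can be illustrated with a generic proximal-style penalty. The sketch below is a minimal illustration of that general technique, not the paper's actual PFPS-LWC algorithm; the function name `local_update` and all hyperparameters are assumptions for this example.

```python
import numpy as np

def local_update(w_global, X, y, lam=0.5, lr=0.1, epochs=50):
    """Train a lightweight linear (logistic) classifier on local data,
    regularized toward the global weights to limit drift and forgetting.
    This is a generic proximal penalty, not PFPS-LWC's exact mechanism."""
    w = w_global.copy()
    for _ in range(epochs):
        logits = X @ w
        probs = 1.0 / (1.0 + np.exp(-logits))        # sigmoid activation
        grad = X.T @ (probs - y) / len(y)            # logistic-loss gradient
        grad += lam * (w - w_global)                 # pull back toward the global model
        w -= lr * grad
    return w
```

A larger `lam` keeps the personalized classifier closer to the global model (preserving shared knowledge), while `lam=0` lets it fit, and potentially overfit, the local distribution.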
Pages: 26
Cited References
47 records
[1] Acar DAE, 2021, arXiv, DOI arXiv:2111.04263
[2] Antunes, Rodolfo Stoffel; da Costa, Cristiano Andre; Kuederle, Arne; Yari, Imrana Abdullahi; Eskofier, Bjoern. Federated Learning for Healthcare: Systematic Review and Architecture Proposal [J]. ACM TRANSACTIONS ON INTELLIGENT SYSTEMS AND TECHNOLOGY, 2022, 13(04).
[3] Chen HY, 2022, arXiv, DOI arXiv:2107.00778
[4] Collins L, 2021, PR MACH LEARN RES, V139
[5] Dinh CT, 2020, ADV NEUR IN, V33
[6] Fantauzzo, Lidia; Fani, Eros; Caldarola, Debora; Tavera, Antonio; Cermelli, Fabio; Ciccone, Marco; Caputo, Barbara. FedDrive: Generalizing Federated Learning to Semantic Segmentation in Autonomous Driving [J]. 2022 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2022: 11504-11511.
[7] Fu, Wenjie; Wang, Huandong; Gao, Chen; Liu, Guanghua; Li, Yong; Jiang, Tao. Privacy-Preserving Individual-Level COVID-19 Infection Prediction via Federated Graph Learning [J]. ACM TRANSACTIONS ON INFORMATION SYSTEMS, 2024, 42(03).
[8] Gao, Liang; Fu, Huazhu; Li, Li; Chen, Yingwen; Xu, Ming; Xu, Cheng-Zhong. FedDC: Federated Learning with Non-IID Data via Local Drift Decoupling and Correction [J]. 2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2022: 10102-10111.
[9] Arivazhagan MG, 2019, arXiv, DOI arXiv:1912.00818
[10] Goodfellow IJ, 2014, ADV NEUR IN, V27, P2672