FedTC: A Personalized Federated Learning Method with Two Classifiers

Cited by: 2
Authors
Liu, Yang [1 ,3 ]
Wang, Jiabo [1 ,2 ]
Liu, Qinbo [1 ]
Xu, Wanyin [1 ]
Gheisari, Mehdi [1 ]
Jiang, Zoe L. [1 ]
Zhang, Jiajia [1 ]
Affiliations
[1] Harbin Inst Technol, Sch Comp Sci & Technol, Shenzhen 518055, Peoples R China
[2] Guangdong Prov Key Lab Novel Secur Intelligence Te, Shenzhen 518055, Peoples R China
[3] Peng Cheng Lab, Res Ctr Cyberspace Secur, Shenzhen 518055, Peoples R China
Source
CMC-COMPUTERS MATERIALS & CONTINUA | 2023, Vol. 76, No. 3
Keywords
Distributed machine learning; federated learning; data heterogeneity; non-independent identically distributed
DOI
10.32604/cmc.2023.039452
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Centralized training of deep learning models poses privacy risks that hinder their deployment. Federated learning (FL) has emerged as a solution, allowing multiple clients to train deep learning models collaboratively without sharing raw data. However, FL is vulnerable to heterogeneously distributed data, which weakens convergence stability and leads to suboptimal performance of the trained model on local data. This is because the old local model is discarded at each round of training, losing personalized information that is critical for maintaining model accuracy and ensuring robustness. In this paper, we propose FedTC, a personalized federated learning method with two classifiers that retains personalized information in the local model and improves the model's performance on local data. FedTC divides the model into two parts, the extractor and the classifier: the classifier is the last layer of the model, and the extractor consists of the remaining layers. The classifier of the local model is always retained so that personalized information is never lost. After the global model is received, the local extractor is overwritten by the global model's extractor, and the global model's classifier serves as an additional classifier that guides local training. FedTC introduces a two-classifier training strategy to coordinate the two classifiers during local model updates. Experimental results on the CIFAR-10 and CIFAR-100 datasets demonstrate that FedTC performs better on heterogeneous data than existing approaches such as FedAvg, FedPer, and local training, achieving a maximum improvement of 27.95% in classification test accuracy over FedAvg.
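The per-round local update described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: models are represented as plain dicts of parameter lists, and the key names (`extractor`, `classifier`, `aux_classifier`) and the helper function are hypothetical.

```python
def fedtc_local_update(local_model, global_model):
    """Prepare a client's model for this round's local training (FedTC sketch).

    Per the abstract:
    - the local extractor is overwritten by the global model's extractor;
    - the local classifier (the model's last layer) is always retained,
      so personalized information is not discarded;
    - the global model's classifier is kept as an additional classifier
      that guides local training.
    """
    return {
        "extractor": list(global_model["extractor"]),   # copied from the server
        "classifier": local_model["classifier"],        # personalized, retained
        "aux_classifier": global_model["classifier"],   # guides local training
    }

# Toy example with scalar "parameters" standing in for layer weights.
local_m = {"extractor": [1.0, 1.0], "classifier": [0.5]}
global_m = {"extractor": [2.0, 2.0], "classifier": [0.9]}
updated = fedtc_local_update(local_m, global_m)
```

In the toy run, `updated` takes its extractor from `global_m` while keeping the personalized classifier from `local_m`, with the global classifier attached as the auxiliary head.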
Pages: 3013-3027 (15 pages)