Federated learning with joint server-client momentum

Cited by: 0
Authors
Li, Boyuan [1,2]
Zhang, Shaohui [1,3]
Han, Qiuying [4]
Affiliations
[1] Zhengzhou Univ, Sch Comp Sci & Artificial Intelligence, Zhengzhou 450001, Peoples R China
[2] Longmen Lab, Innovat Ctr Intelligent Syst, Luoyang 471000, Peoples R China
[3] Zhoukou Normal Univ, Sch Artificial Intelligence, Zhoukou 466001, Peoples R China
[4] Zhoukou Normal Univ, Sch Comp Sci & Technol, Zhoukou 466001, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Federated Learning; Data Heterogeneity; Edge Computing; Internet of Things; Distributed Learning; Challenges;
DOI
10.1038/s41598-025-99385-y
Chinese Library Classification (CLC)
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences];
Subject Classification Codes
07; 0710; 09;
Abstract
Federated Learning is a collaborative modeling approach that trains a single global model across multiple clients in a decentralized manner. However, heterogeneity in clients' local data can substantially degrade algorithm performance, a problem that has drawn considerable attention. In this study, we introduce a novel Federated Learning algorithm, Federated Joint Server-Client Momentum (FedJSCM), to address data heterogeneity in real-world Federated Learning applications. FedJSCM efficiently reuses global gradient information from previous communication rounds: by transmitting gradient momentum, it adjusts both client-side gradient descent and server-side model fusion. This corrective mechanism mitigates the bias introduced by heterogeneous local updates and improves the stability of Stochastic Gradient Descent (SGD). We provide a theoretical analysis of FedJSCM's advantages and conduct extensive empirical studies showing superior performance across a variety of tasks and robustness under varying degrees of data heterogeneity, with FedJSCM outperforming existing algorithms by 1-3% in accuracy.
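The abstract describes the mechanism only in prose, so the following is a minimal, illustrative sketch of a joint server-client momentum scheme in that spirit: the server keeps a momentum vector built from previous communication rounds, sends it to the clients so their local gradient steps are pulled toward the global trend, and folds the aggregated client directions back into the momentum before fusing them into the global model. Everything here is an assumption for illustration, not the paper's implementation: the function names (client_update, server_round, local_gradient), the momentum coefficient beta, the learning rates, the number of local steps, and the toy least-squares task are all stand-ins; the actual FedJSCM update rules and their analysis are given in the paper.

import numpy as np

rng = np.random.default_rng(0)

def local_gradient(w, X, y):
    # Least-squares gradient on one client's local data (toy stand-in for a model).
    return 2.0 * X.T @ (X @ w - y) / len(y)

def client_update(w_global, m_global, X, y, lr=0.05, beta=0.7, local_steps=10):
    # Local SGD in which every step is corrected by the server-sent global momentum,
    # pulling the client's descent direction toward the global trend.
    w = w_global.copy()
    for _ in range(local_steps):
        g = local_gradient(w, X, y)
        d = (1.0 - beta) * g + beta * m_global
        w -= lr * d
    # Return the average corrected direction actually used (gradient scale).
    return (w_global - w) / (lr * local_steps)

def server_round(w_global, m_global, clients, beta=0.7, server_lr=0.5):
    # Collect the clients' corrected directions, refresh the cross-round momentum,
    # and use that momentum to fuse the updates into the global model.
    dirs = [client_update(w_global, m_global, X, y) for X, y in clients]
    avg_dir = np.mean(dirs, axis=0)
    m_global = beta * m_global + (1.0 - beta) * avg_dir
    w_global = w_global - server_lr * m_global
    return w_global, m_global

# Toy heterogeneous clients: each sees inputs drawn around a different mean,
# so its local gradients are biased relative to the global objective.
w_true = np.array([2.0, -1.0])
clients = []
for shift in (-1.0, 0.0, 1.0):
    X = rng.normal(loc=shift, scale=1.0, size=(64, 2))
    y = X @ w_true + 0.1 * rng.normal(size=64)
    clients.append((X, y))

w, m = np.zeros(2), np.zeros(2)
for _ in range(150):  # communication rounds
    w, m = server_round(w, m, clients)
print("estimated weights:", w)  # approaches w_true as rounds accumulate

The same coefficient beta is reused on both sides purely for brevity; decoupling the client-side correction strength from the server-side fusion momentum is an equally plausible reading of the abstract.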
Pages: 19