A two-stage federated learning method for personalization via selective collaboration

Cited by: 1
Authors
Xu, Jiuyun [1 ]
Zhou, Liang [1 ]
Zhao, Yingzhi [1 ]
Li, Xiaowen [1 ]
Zhu, Kongshang [1 ]
Xu, Xiangrui [2 ]
Duan, Qiang [3 ]
Zhang, Ruru [4 ]
Affiliations
[1] China Univ Petr East China, Qingdao Software Inst, Coll Comp Sci & Technol, Dongying, Shandong, Peoples R China
[2] Old Dominion Univ, Dept Comp Sci, Norfolk, VA 23529 USA
[3] Penn State Univ, Informat Sci & Technol Dept, Abington, PA 19001 USA
[4] China Mobile Suzhou Software Technol Co, 58 Kunshan Rd, Suzhou 215163, Jiangsu, Peoples R China
Keywords
Federated learning; Personalization; Client selection; Hierarchical model fusion;
DOI
10.1016/j.comcom.2025.108053
Chinese Library Classification (CLC): TP [Automation Technology, Computer Technology]
Discipline classification code: 0812
Abstract
As an emerging distributed learning paradigm, federated learning has recently received much attention. Traditional federated learning trains a single global model on decentralized data, but when the data distribution is uneven, one global model may not adapt well to every client; for some clients, purely local training may even outperform the global model. Against this background, clustering similar clients into the same group is a common approach. However, heterogeneity remains among clients within a group, and general clustering methods usually assume that each client belongs to exactly one class, whereas in real-world scenarios the complexity of data distributions makes it difficult to categorize a client into a single class. To solve these problems, we propose a two-stage federated learning method for personalization via selective collaboration (FedSC). Unlike previous clustering methods, we focus on how each client can independently exclude other clients with significantly different distributions, breaking the restriction that a client can belong to only one category. Each client selects collaborators that are most conducive to its local objective and independently builds its own collaborative group, then engages in federated learning only with its group members, avoiding negative knowledge transfer. Furthermore, FedSC performs finer-grained processing within each group: instead of directly overriding the local model as in traditional approaches, it adaptively fuses the group and local models in a hierarchical manner. Extensive experiments show that the proposed method considerably improves model performance under different heterogeneity scenarios.
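The two ideas in the abstract (per-client collaborator selection and group/local model fusion) can be sketched as follows. This is a generic illustration, not the paper's actual algorithm: the abstract does not specify the similarity measure or the fusion rule, so the cosine-similarity threshold, the `select_collaborators` and `fuse` helpers, and the mixing weight `alpha` are all hypothetical choices made for this sketch.

```python
import numpy as np

def select_collaborators(updates, client_id, threshold=0.5):
    """For one client, keep only peers whose update direction is similar
    (cosine similarity above a threshold). Each client builds its own
    group, so a client is not forced into a single global cluster."""
    me = updates[client_id]
    group = []
    for cid, u in updates.items():
        if cid == client_id:
            continue
        sim = float(np.dot(me, u) / (np.linalg.norm(me) * np.linalg.norm(u)))
        if sim >= threshold:
            group.append(cid)
    return group

def fuse(local_model, group_model, alpha):
    """Interpolate group and local model parameters instead of letting
    the group model overwrite the local one (alpha=0 keeps the local
    model unchanged, alpha=1 adopts the group model entirely)."""
    return (1.0 - alpha) * local_model + alpha * group_model

# Toy example: three clients represented by 2-D update vectors.
updates = {
    0: np.array([1.0, 0.0]),
    1: np.array([0.9, 0.1]),   # similar direction to client 0
    2: np.array([-1.0, 0.2]),  # dissimilar direction, excluded
}
group_of_0 = select_collaborators(updates, 0)
fused = fuse(np.array([1.0, 1.0]), np.array([3.0, 3.0]), alpha=0.5)
```

In this toy run, client 0's group contains only client 1 (client 2's update points the opposite way), and the fused parameters sit halfway between the local and group models.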
Pages: 10