Personalized federated learning: A Clustered Distributed Co-Meta-Learning approach

Cited by: 8
Authors
Ren, Maoye [3 ]
Wang, Zhe [1 ,2 ]
Yu, Xinhai [3 ]
Affiliations
[1] Minist Educ, Key Lab Smart Mfg Energy Chem Proc, Shanghai, Peoples R China
[2] East China Univ Sci & Technol, Dept Comp Sci & Engn, Shanghai 200237, Peoples R China
[3] East China Univ Sci & Technol, Dept Mech & Power Engn, Shanghai 200237, Peoples R China
Keywords
Federated learning; Distributed Co-Meta-Learning; Personalized federated learning; Efficient and effective; Few-shot learning; LOCAL-SGD;
DOI
10.1016/j.ins.2023.119499
CLC number
TP [Automation technology; computer technology];
Discipline classification code
0812;
Abstract
Federated Learning (FL) aims to train a model across multiple parties while preserving the privacy of users' data. Traditional FL produces only a common model shared by all users and does not adapt it to each user. Personalized FL approaches therefore emerged to further adapt the model to individual users, yielding better performance. Among these personalized FL methods, meta-learned approaches achieve the most advanced performance. However, this personalization scheme adapts the model to each user using only that user's own data, so the features it learns are neither sufficient nor rich, especially when some users hold extremely little data. In this paper, we study a more effective variant of personalized federated learning. We first formalize a new learning problem and propose a Distributed Co-Meta-Learning approach for it. We then show how to design a new personalized FL framework based on this Distributed Co-Meta-Learning approach. To optimize the proposed framework while reducing the computational cost of the optimization, we study a chain-estimation aggregation method, which also reduces the computational load on the clients. Further, we give a theoretical convergence analysis of our method in the most complex case, non-convex and non-IID problems, and analyze the properties of some of its parameters. Experiments demonstrate that our method achieves state-of-the-art performance in personalized FL.
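For context on the meta-learned personalization scheme the abstract builds on, the following is a minimal illustrative sketch of a Per-FedAvg/MAML-style federated round on toy linear-regression clients: each client computes a first-order meta-gradient (adapt on a support split, evaluate on a query split), the server averages these meta-gradients, and each client later personalizes the resulting meta-model with a few local steps. All names (make_client, local_meta_grad, inner_lr, etc.) are hypothetical; this does not reproduce the paper's Clustered Distributed Co-Meta-Learning or its chain-estimation aggregation.

```python
# Illustrative sketch only (assumed names/hyperparameters), not the paper's algorithm.
import numpy as np

rng = np.random.default_rng(0)

def make_client(true_w, n=20, dim=5):
    """Generate a tiny non-IID client dataset with y = X @ true_w + noise."""
    X = rng.normal(size=(n, dim))
    y = X @ true_w + 0.1 * rng.normal(size=n)
    return X, y

def grad(w, X, y):
    """Gradient of the mean-squared-error loss 0.5 * ||X w - y||^2 / n."""
    return X.T @ (X @ w - y) / len(y)

def local_meta_grad(w, X, y, inner_lr=0.05):
    """First-order MAML-style meta-gradient: one inner adaptation step on a
    support split, then the gradient of the adapted model on a query split."""
    half = len(y) // 2
    Xs, ys, Xq, yq = X[:half], y[:half], X[half:], y[half:]
    w_adapted = w - inner_lr * grad(w, Xs, ys)   # inner (personalization) step
    return grad(w_adapted, Xq, yq)               # outer (meta) gradient

dim, n_clients = 5, 8
clients = [make_client(rng.normal(size=dim)) for _ in range(n_clients)]
w_global = np.zeros(dim)

for rnd in range(200):                            # communication rounds
    metas = [local_meta_grad(w_global, X, y) for X, y in clients]
    w_global -= 0.1 * np.mean(metas, axis=0)      # server averages meta-gradients

# Personalization: each client takes a few local steps from the meta-model.
for X, y in clients[:2]:
    w_personal = w_global - 0.05 * grad(w_global, X, y)
    print("personalized MSE:", np.mean((X @ w_personal - y) ** 2))
```

As the abstract notes, adapting each user only on its own data (the final loop above) is exactly the limitation the paper targets when a user holds very little data; the proposed approach instead lets related users co-meta-learn, which this sketch does not capture.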
Pages: 23