Knowledge-Aware Parameter Coaching for Personalized Federated Learning

Cited by: 0
Authors
Zhi, Mingjian [1 ]
Bi, Yuanguo [1 ]
Xu, Wenchao [2 ]
Wang, Haozhao [3 ]
Xiang, Tianao [1 ]
Affiliations
[1] Northeastern University, Shenyang, China
[2] The Hong Kong Polytechnic University, Hong Kong, China
[3] Huazhong University of Science and Technology, Wuhan, China
Source
Thirty-Eighth AAAI Conference on Artificial Intelligence, Vol. 38, No. 15 | 2024
Funding
National Natural Science Foundation of China
Keywords
DOI
Not available
CLC Classification Number
TP18 [Theory of Artificial Intelligence]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Personalized Federated Learning (pFL) can effectively exploit non-IID data from distributed clients by customizing a personalized model for each client. Existing pFL methods either simply aggregate the local model as a whole or incur significant training overhead to induce inter-client personalized weights, so clients cannot efficiently exploit mutually relevant knowledge from one another. In this paper, we propose a knowledge-aware parameter coaching scheme in which each client can swiftly and granularly refer to the parameters of other clients to guide its local training, so that accurate personalized client models can be produced efficiently and without contradictory knowledge. Specifically, a novel regularizer conducts layer-wise parameter coaching via a relation cube, which is constructed from the knowledge represented by the layered parameters of all clients. We then develop an optimization method to jointly update the relation cube and the parameters of each client. It is theoretically demonstrated that the convergence of the proposed method is guaranteed under both convex and non-convex settings. Extensive experiments on various datasets show that the proposed method outperforms state-of-the-art baselines in terms of accuracy and convergence speed.
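The following is a minimal sketch, not the authors' released code, of the layer-wise parameter-coaching idea described in the abstract. It assumes PyTorch, a relation cube A of shape (num_clients, num_clients, num_layers), and clients sharing one architecture; the names `coaching_regularizer`, `relation_row`, and `lam` are hypothetical and introduced only for illustration.

```python
# Sketch of a layer-wise parameter-coaching regularizer driven by a relation
# cube. Each local layer is pulled toward a relation-weighted mix of the
# corresponding layers of the other clients.
import torch


def coaching_regularizer(local_layers, peer_layers, relation_row, lam=0.1):
    """Layer-wise coaching penalty for one client.

    local_layers : list[Tensor]       -- this client's parameters, one per layer
    peer_layers  : list[list[Tensor]] -- peer_layers[j][l] = client j, layer l
    relation_row : Tensor (num_peers, num_layers) -- slice A[i] of the relation cube
    lam          : regularization strength
    """
    reg = 0.0
    for l, w in enumerate(local_layers):
        # Relation-weighted reference parameters for layer l.
        ref = sum(relation_row[j, l] * peer_layers[j][l]
                  for j in range(len(peer_layers)))
        reg = reg + (w - ref).pow(2).sum()
    return lam * reg


# Usage during local training (hypothetical names):
#   loss = task_loss + coaching_regularizer(list(model.parameters()), peers, A[i])
# The relation cube A itself would be updated in an alternating optimization
# step, as outlined in the abstract.
```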
Pages: 17069-17077
Page count: 9