Personalized Federated Learning with Parameter Propagation

Cited: 5
Authors
Wu, Jun [1 ]
Bao, Wenxuan [1 ]
Ainsworth, Elizabeth [2 ]
He, Jingrui [1 ]
Affiliations
[1] Univ Illinois, Chicago, IL 60680 USA
[2] Univ Illinois, USDA ARS, Global Change & Photosynth Res Unit, Chicago, IL 60680 USA
Source
PROCEEDINGS OF THE 29TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2023 | 2023
Funding
US National Institute of Food and Agriculture; US National Science Foundation
Keywords
federated learning; parameter propagation; negative transfer;
DOI
10.1145/3580305.3599464
CLC Classification
TP [automation technology; computer technology]
Discipline Code
0812
Abstract
With decentralized data collected from diverse clients, a personalized federated learning paradigm has been proposed for training machine learning models without exchanging raw data from local clients. We dive into personalized federated learning from the perspective of privacy-preserving transfer learning, and identify the limitations of previous personalized federated learning algorithms. First, previous works suffer from negative knowledge transfer for some clients because they focus on the overall performance of all clients. Second, high communication costs are required to explicitly learn statistical task relatedness among clients. Third, it is computationally expensive to generalize the learned knowledge from experienced clients to new clients. To solve these problems, in this paper, we propose a novel federated parameter propagation (FEDORA) framework for personalized federated learning. Specifically, we reformulate standard personalized federated learning as a privacy-preserving transfer learning problem, with the goal of improving the generalization performance for every client. The crucial idea behind FEDORA is to learn how to transfer and whether to transfer simultaneously, including (1) adaptive parameter propagation: each client adaptively propagates its parameters to others based on their task relatedness (e.g., explicitly measured by distribution similarity), and (2) selective regularization: each client regularizes its local personalized model with the received parameters only when those parameters are positively correlated with the generalization performance of its local model. Experiments on a variety of federated learning benchmarks demonstrate the effectiveness of the proposed FEDORA framework over state-of-the-art personalized federated learning baselines.
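The abstract names two mechanisms, adaptive parameter propagation and selective regularization, without giving their formulation. A minimal Python sketch of the two ideas follows; note that using cosine similarity of parameters as the task-relatedness proxy, the mixing weight `lam`, and the helper names are illustrative assumptions, not the paper's actual algorithm.

```python
import math

def cosine(u, v):
    # Cosine similarity between two parameter vectors (lists of floats).
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def propagation_weights(params, i):
    # Adaptive parameter propagation (sketch): client i weights its peers
    # by task relatedness, here proxied by the (clamped) cosine similarity
    # of their model parameters; negatively related peers get zero weight.
    sims = [max(cosine(params[i], p), 0.0) if j != i else 0.0
            for j, p in enumerate(params)]
    total = sum(sims)
    return [s / total if total else 0.0 for s in sims]

def selective_update(local, received, val_loss, lam=0.5):
    # Selective regularization (sketch): pull the local model toward the
    # received parameters only if that lowers the local validation loss,
    # i.e., only when the received knowledge transfers positively.
    candidate = [(1 - lam) * a + lam * b for a, b in zip(local, received)]
    return candidate if val_loss(candidate) < val_loss(local) else local
```

For example, a client whose parameters point in the opposite direction of client 0's receives zero propagation weight, and an update that would hurt client 0's validation loss is rejected, keeping the local model unchanged.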
Pages: 2594-2605
Page count: 12