Adaptive Personalized Federated Learning for Non-IID Data with Continual Distribution Shift

Cited by: 0
Authors
Chen, Sisi [1 ]
Liu, Weijie [1 ]
Zhang, Xiaoxi [1 ]
Xu, Hong [2 ]
Lin, Wanyu [3 ]
Chen, Xu [1 ]
Affiliations
[1] Sun Yat-sen University, Guangzhou, China
[2] The Chinese University of Hong Kong, Hong Kong, China
[3] The Hong Kong Polytechnic University, Hong Kong, China
Source
2024 IEEE/ACM 32nd International Symposium on Quality of Service (IWQoS), 2024
DOI
10.1109/IWQoS61813.2024.10682851
CLC Number
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
Federated Learning (FL) has surged in popularity, allowing machine learning models to be collaboratively trained on decentralized client data while upholding privacy and security standards. However, leveraging locally stored data introduces challenges related to data heterogeneity. While many past studies have addressed this non-IID problem, they often overlook the dynamic nature of each individual client's data and its continual distribution shift. In this paper, we focus on the challenges posed by temporal data distribution shift alongside non-IID data across clients, a more prevalent yet complex situation in real-world FL. We propose to analytically capture the evolving nature of each local data distribution by modeling it as a time-varying composite of multiple latent Gaussian distributions. We then employ the expectation-maximization (EM) algorithm to infer the distribution model parameters from the currently observed training data, ensuring that the learned mixture proportion weights follow a consistent trajectory. Additionally, by embedding an adaptive data partitioning method into the EM algorithm and using each partition to train a distinct sub-model, we realize an intuitive and novel personalized FL paradigm. This refines FL training by exploiting the heterogeneity and temporal shifts of clients' datasets. We derive analytical results that guarantee the convergence of our training method. Comprehensive tests across diverse datasets and distribution configurations also underscore our improved efficacy compared to several state-of-the-art methods.
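
For intuition only, the sketch below illustrates the kind of per-client step the abstract describes: fitting a Gaussian mixture by EM, hard-partitioning the local data by the most likely component, and training a distinct sub-model per partition. It is not the authors' implementation; the function name local_partition_and_train, the use of scikit-learn's GaussianMixture and LogisticRegression, and the toy data are assumptions, and the paper's temporal-consistency constraint on the mixture weights and its federated aggregation step are omitted.

# Minimal sketch of one client's local step (assumptions noted above).
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.linear_model import LogisticRegression


def local_partition_and_train(X, y, n_components=3, seed=0):
    """Fit a Gaussian mixture by EM, hard-partition the local data by the most
    likely component, and train one sub-model per partition."""
    gmm = GaussianMixture(n_components=n_components, random_state=seed)
    assignments = gmm.fit_predict(X)  # EM fit + hard component assignment
    sub_models = {}
    for k in range(n_components):
        mask = assignments == k
        # Skip empty or single-class partitions that a classifier cannot fit.
        if mask.sum() == 0 or np.unique(y[mask]).size < 2:
            continue
        sub_models[k] = LogisticRegression(max_iter=200).fit(X[mask], y[mask])
    return gmm, sub_models  # gmm.weights_ holds the learned mixture proportions


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy non-IID local data: two latent Gaussian components in feature space,
    # with labels that vary inside each component.
    X = np.vstack([rng.normal(0.0, 1.0, (100, 5)), rng.normal(3.0, 1.0, (100, 5))])
    y = (X[:, 0] > X[:, 1]).astype(int)
    gmm, sub_models = local_partition_and_train(X, y, n_components=2)
    print("mixture proportions:", np.round(gmm.weights_, 3))
    print("sub-models trained for components:", sorted(sub_models))

In the paper's setting, the learned mixture proportions would additionally be constrained to evolve smoothly over training rounds, so that each partition (and its sub-model) tracks a latent distribution as it drifts over time.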
Pages: 6