A Study of Enhancing Federated Learning on Non-IID Data with Server Learning

Citations: 0
Authors
Mai V.S. [1 ]
La R.J. [2 ]
Zhang T. [1 ]
Affiliations
[1] National Institute of Standards and Technology (NIST), Gaithersburg, MD
[2] NIST, University of Maryland, College Park, MD
Source
IEEE Transactions on Artificial Intelligence | 2024 / Vol. 5 / No. 11
Keywords
Analytical models; Convergence; Data models; Distributed machine learning; Federated learning; Non-IID data; Servers; Training; Training data
DOI
10.1109/TAI.2024.3430250
Abstract
Federated learning (FL) has emerged as a means of distributed learning that uses local data stored at clients together with a coordinating server. Recent studies have shown that FL can suffer from poor performance and slower convergence when the training data at the clients are not independent and identically distributed (IID). Here, we consider auxiliary server learning as a complementary approach to improving the performance of FL on non-IID data. Our analysis and experiments show that this approach can achieve significant improvements in both model accuracy and convergence time, even when the dataset utilized by the server is small and its distribution differs from that of the clients' aggregate data. Moreover, experimental results suggest that auxiliary server learning delivers benefits when employed together with other techniques proposed to mitigate the performance degradation of FL on non-IID data.
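The approach the abstract describes can be illustrated with a minimal sketch: standard FedAvg rounds on skewed client data, with the server taking a few extra gradient steps on its own small auxiliary dataset after each aggregation. This is an illustrative toy (linear model, NumPy, hypothetical function names such as `sgd_steps`), not the paper's actual algorithm or experimental setup.

```python
import numpy as np

# Toy sketch: FedAvg plus auxiliary server learning on a linear model.
# All names and hyperparameters here are illustrative assumptions.
rng = np.random.default_rng(0)
w_true = np.array([2.0, -1.0])

def make_data(n, shift):
    # Each client draws features around a different mean -> non-IID data.
    X = rng.normal(shift, 1.0, size=(n, 2))
    y = X @ w_true + 0.01 * rng.normal(size=n)
    return X, y

clients = [make_data(50, s) for s in (-2.0, 0.0, 2.0)]  # skewed client datasets
server_data = make_data(20, 0.5)  # small auxiliary set, distribution differs too

def sgd_steps(w, X, y, lr=0.01, steps=5):
    # A few local gradient-descent steps on the mean-squared-error loss.
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

w = np.zeros(2)
for _ in range(30):  # communication rounds
    local = [sgd_steps(w, X, y) for X, y in clients]  # local client updates
    w = np.mean(local, axis=0)                        # FedAvg aggregation
    w = sgd_steps(w, *server_data)                    # auxiliary server update

mse = np.mean((clients[0][0] @ w - clients[0][1]) ** 2)
```

The server step acts as a corrective nudge after averaging; even with only 20 auxiliary samples from a shifted distribution, the combined procedure recovers the underlying model closely in this toy setting.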
Pages: 1 / 15
Page count: 14