Asynchronous Federated Learning for Sensor Data with Concept Drift

Cited by: 26
Authors
Chen, Yujing [1]
Chai, Zheng [1]
Cheng, Yue [1]
Rangwala, Huzefa [1]
Affiliations
[1] George Mason Univ, Dept Comp Sci, Fairfax, VA 22030 USA
Source
2021 IEEE INTERNATIONAL CONFERENCE ON BIG DATA (BIG DATA) | 2021
Keywords
federated learning; asynchronous learning; concept drift; communication-efficient; classification
DOI
10.1109/BigData52589.2021.9671924
CLC number
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Federated learning (FL) enables multiple distributed devices to jointly train a shared model without any participant revealing its local data to a centralized server. Most previous FL approaches assume that the data on each device are fixed and stationary throughout the training process. This assumption is unrealistic: devices typically have varying sampling rates and different system configurations, and the underlying distribution of the device data can change dynamically over time, a phenomenon known as concept drift. Concept drift complicates the learning process because of the inconsistency between existing and upcoming data. Traditional concept drift handling techniques, such as chunk-based and ensemble learning-based methods, are unsuitable for federated learning frameworks due to the heterogeneity of local devices. We propose a novel approach, FedConD, to detect and deal with concept drift on local devices and minimize its effect on model performance in asynchronous FL. The drift detection strategy is based on an adaptive mechanism that uses the historical performance of the local models. Drift adaptation is realized by adjusting the regularization parameter of the objective function on each local device. Additionally, we design a communication strategy on the server side that selects local updates in a prudent fashion to speed up model convergence. Experimental evaluations on three evolving data streams and two image datasets show that FedConD detects and handles concept drift, and also reduces the overall communication cost compared to other baseline methods.
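The abstract describes two local-device mechanisms: drift detection from the historical performance of the local model, and drift adaptation by adjusting the regularization parameter of the local objective. A minimal sketch of that idea is shown below; the class name, the statistical drift test (loss exceeding the recent mean by a multiple of its deviation), and the multiplicative update of the regularization weight `mu` are illustrative assumptions, not the paper's exact algorithm.

```python
import statistics

class DriftMonitor:
    """Hypothetical sketch of per-device drift handling in the spirit of
    FedConD: flag drift from the history of local losses, then adapt the
    regularization weight of the local objective. All thresholds and
    update rules here are assumed for illustration."""

    def __init__(self, mu=0.01, window=10, k=2.0):
        self.mu = mu            # regularization weight on the local objective
        self.window = window    # number of recent losses kept as history
        self.k = k              # sensitivity of the drift test
        self.history = []       # recent local training losses

    def observe(self, loss):
        """Return True if the new loss deviates upward from recent history."""
        drifted = False
        if len(self.history) >= self.window:
            mean = statistics.mean(self.history)
            std = statistics.pstdev(self.history)
            drifted = loss > mean + self.k * std
        self.history.append(loss)
        self.history = self.history[-self.window:]  # keep a sliding window
        return drifted

    def adapt(self, drifted):
        """On drift, strengthen regularization toward the shared model;
        otherwise let it decay back toward its floor."""
        if drifted:
            self.mu = min(self.mu * 2.0, 1.0)
        else:
            self.mu = max(self.mu * 0.9, 1e-3)
        return self.mu
```

A device would call `observe` after each local training round and pass the result to `adapt` before forming its next local objective; a larger `mu` pulls the local update closer to the global model when incoming data no longer match the history.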
Pages: 4822-4831 (10 pages)