Concept drift detection and adaptation for federated and continual learning

Cited by: 0
Authors
Fernando E. Casado
Dylan Lema
Marcos F. Criado
Roberto Iglesias
Carlos V. Regueiro
Senén Barro
Affiliations
[1] Universidade de Santiago de Compostela, CiTIUS (Centro Singular de Investigación en Tecnoloxías Intelixentes)
[2] Universidade da Coruña, CITIC, Computer Architecture Group
Source
Multimedia Tools and Applications | 2022 / Volume 81
Keywords
Federated learning; Continual learning; Nonstationarity; Concept drift; Federated Averaging; Catastrophic forgetting; Rehearsal;
DOI
Not available
Abstract
Smart devices, such as smartphones, wearables, and robots, can collect vast amounts of data from their environment. This data is suitable for training machine learning models, which can significantly improve the devices' behavior and, therefore, the user experience. Federated learning is a young and popular framework that allows multiple distributed devices to train deep learning models collaboratively while preserving data privacy. Nevertheless, this approach may not be optimal for scenarios where data distribution is non-identical among the participants or changes over time, causing what is known as concept drift. Little research has yet been done in this field, but this kind of situation is quite frequent in real life and poses new challenges to both continual and federated learning. Therefore, in this work, we present a new method, called Concept-Drift-Aware Federated Averaging (CDA-FedAvg). Our proposal is an extension of the most popular federated algorithm, Federated Averaging (FedAvg), enhancing it for continual adaptation under concept drift. We empirically demonstrate the weaknesses of regular FedAvg and prove that CDA-FedAvg outperforms it in this type of scenario.
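For context on the baseline that the abstract refers to, below is a minimal sketch of the standard FedAvg server aggregation step, not the paper's CDA-FedAvg extension. The function name fedavg_aggregate, the NumPy-array representation of client models, and the example values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def fedavg_aggregate(client_weights, client_sizes):
    """Standard FedAvg server step: average the clients' model parameters,
    weighting each client by the number of samples it trained on locally.

    client_weights: list where each element is a list of np.ndarray
                    (one array per model layer) returned by a client.
    client_sizes:   list of local dataset sizes, aligned with client_weights.
    """
    total = float(sum(client_sizes))
    num_layers = len(client_weights[0])
    aggregated = []
    for layer in range(num_layers):
        # Weighted sum over clients of this layer's parameters.
        layer_avg = sum(
            (n / total) * w[layer] for w, n in zip(client_weights, client_sizes)
        )
        aggregated.append(layer_avg)
    return aggregated

# Illustrative usage with two clients and a two-layer model.
client_a = [np.ones((2, 2)), np.zeros(2)]
client_b = [np.zeros((2, 2)), np.ones(2)]
global_model = fedavg_aggregate([client_a, client_b], client_sizes=[30, 10])
```

According to the abstract and keywords, CDA-FedAvg augments this basic scheme on the client side with concept drift detection, continual adaptation, and rehearsal to mitigate catastrophic forgetting; the details are given in the paper itself.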
Pages: 3397-3419 (22 pages)