Continual Learning Using Bayesian Neural Networks

Cited by: 23
Authors
Li, Honglin [1 ,2 ]
Barnaghi, Payam [1 ,2 ,3 ]
Enshaeifar, Shirin [1 ,2 ]
Ganz, Frieder [4 ]
Affiliations
[1] Univ Surrey, Ctr Vis Speech & Signal Proc CVSSP, Guildford GU2 7XH, Surrey, England
[2] UK Dementia Res Inst UK DRI, Care Res & Technol Ctr, London WC1E 6BT, England
[3] Imperial Coll London, Dept Brain Sci, London SW7 2BU, England
[4] Adobe Syst Engn GmbH, D-22767 Hamburg, Germany
Keywords
Task analysis; Adaptation models; Training; Bayes methods; Modeling; Uncertainty; Gaussian distribution; Bayesian neural networks (BNNs); catastrophic forgetting; continual learning; incremental learning; uncertainty; SYSTEMS;
DOI
10.1109/TNNLS.2020.3017292
CLC number
TP18 [Theory of artificial intelligence];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Continual learning models learn and adapt to new changes and tasks over time. However, in continual and sequential learning scenarios, in which models are trained on different data with varying distributions, neural networks (NNs) tend to forget previously learned knowledge. This phenomenon is often referred to as catastrophic forgetting, and it is an inevitable problem for continual learning models in dynamic environments. To address this issue, we propose a method, called continual Bayesian learning networks (CBLNs), which enables a network to allocate additional resources to adapt to new tasks without forgetting the previously learned ones. Using a Bayesian NN, CBLN maintains a mixture of Gaussian posterior distributions associated with different tasks. The method optimizes the number of resources needed to learn each task and avoids an exponential growth in the resources involved in learning multiple tasks. It does not need access to past training data and, at test time, automatically chooses suitable weights to classify data points based on an uncertainty criterion. We have evaluated the method on the MNIST and UCR time-series data sets. The results show that the method addresses the catastrophic forgetting problem at a promising rate compared with state-of-the-art models.
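The test-time mechanism the abstract describes (one Gaussian posterior over the weights per task, with the posterior selected by an uncertainty criterion) can be sketched roughly as follows. This is a numpy-only toy, not the paper's implementation: the class and function names, the mean-field Gaussian parameterization, and the use of mean predictive entropy as the uncertainty criterion are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

class BayesianLinear:
    """Mean-field Gaussian posterior over the weights of one linear layer."""
    def __init__(self, n_in, n_out):
        self.mu = rng.normal(0.0, 0.1, (n_in, n_out))      # posterior means
        self.log_var = np.full((n_in, n_out), -4.0)        # posterior log-variances

    def sample(self):
        # Reparameterization: w = mu + sigma * eps
        eps = rng.normal(size=self.mu.shape)
        return self.mu + np.exp(0.5 * self.log_var) * eps

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def predictive_uncertainty(layer, x, n_samples=20):
    """Mean predictive entropy over Monte Carlo weight samples."""
    probs = np.stack([softmax(x @ layer.sample()) for _ in range(n_samples)])
    mean_p = probs.mean(axis=0)
    return -(mean_p * np.log(mean_p + 1e-12)).sum(axis=-1).mean()

# One posterior per task: the "mixture of Gaussian posteriors" of the abstract.
task_posteriors = {"task_A": BayesianLinear(4, 3), "task_B": BayesianLinear(4, 3)}

def classify(x):
    """Pick the task whose posterior is least uncertain about x, then predict."""
    best = min(task_posteriors,
               key=lambda t: predictive_uncertainty(task_posteriors[t], x))
    layer = task_posteriors[best]
    return best, softmax(x @ layer.mu).argmax(axis=-1)

x = rng.normal(size=(5, 4))   # a batch of 5 four-dimensional test points
task, labels = classify(x)
```

Because each task keeps its own posterior, adapting to a new task adds parameters instead of overwriting old ones, which is how this family of methods sidesteps catastrophic forgetting; the paper's contribution is keeping that growth sub-exponential.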
Pages: 4243-4252
Page count: 10
Related papers
50 records in total
  • [31] Deep Learning Neural Networks and Bayesian Neural Networks in Data Analysis
    Chernoded, Andrey
    Dudko, Lev
    Myagkov, Igor
    Volkov, Petr
    XXIII INTERNATIONAL WORKSHOP HIGH ENERGY PHYSICS AND QUANTUM FIELD THEORY (QFTHEP 2017), 2017, 158
  • [32] Auto claim fraud detection using Bayesian learning neural networks
    Viaene, S
    Dedene, G
    Derrig, RA
    EXPERT SYSTEMS WITH APPLICATIONS, 2005, 29 (03) : 653 - 666
  • [33] Spiking Generative Adversarial Networks With a Neural Network Discriminator: Local Training, Bayesian Models, and Continual Meta-Learning
    Rosenfeld, Bleema
    Simeone, Osvaldo
    Rajendran, Bipin
    IEEE TRANSACTIONS ON COMPUTERS, 2022, 71 (11) : 2778 - 2791
  • [34] On Sequential Bayesian Inference for Continual Learning
    Kessler, Samuel
    Cobb, Adam
    Rudner, Tim G. J.
    Zohren, Stefan
    Roberts, Stephen J.
    ENTROPY, 2023, 25 (06)
  • [35] Bayesian Structural Adaptation for Continual Learning
    Kumar, Abhishek
    Chatterjee, Sunabha
    Rai, Piyush
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139
  • [36] Bayesian learning for neural networks: an algorithmic survey
    Magris, Martin
    Iosifidis, Alexandros
    ARTIFICIAL INTELLIGENCE REVIEW, 2023, 56 (10) : 11773 - 11823
  • [37] Sequential Bayesian learning for modular neural networks
    Wang, P
    Fan, Z
    Li, YF
    Feng, S
    ADVANCES IN NEURAL NETWORKS - ISNN 2005, PT 1, PROCEEDINGS, 2005, 3496 : 652 - 659
  • [38] Efficacy of Bayesian Neural Networks in Active Learning
    Rakesh, Vineeth
    Jain, Swayambhoo
    2021 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION WORKSHOPS, CVPRW 2021, 2021, : 2601 - 2609
  • [39] Bayesian Nonparametric Federated Learning of Neural Networks
    Yurochkin, Mikhail
    Agarwal, Mayank
    Ghosh, Soumya
    Greenewald, Kristjan
    Hoang, Trong Nghia
    Khazaeni, Yasaman
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019, 97