Continual Learning Using Bayesian Neural Networks

Cited by: 24
Authors
Li, Honglin [1 ,2 ]
Barnaghi, Payam [1 ,2 ,3 ]
Enshaeifar, Shirin [1 ,2 ]
Ganz, Frieder [4 ]
Affiliations
[1] Univ Surrey, Ctr Vis Speech & Signal Proc CVSSP, Guildford GU2 7XH, Surrey, England
[2] UK Dementia Res Inst UK DRI, Care Res & Technol Ctr, London WC1E 6BT, England
[3] Imperial Coll London, Dept Brain Sci, London SW7 2BU, England
[4] Adobe Syst Engn GmbH, D-22767 Hamburg, Germany
Keywords
Task analysis; Adaptation models; Training; Bayes methods; Modeling; Uncertainty; Gaussian distribution; Bayesian neural networks (BNNs); catastrophic forgetting; continual learning; incremental learning; uncertainty; SYSTEMS;
DOI
10.1109/TNNLS.2020.3017292
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Continual learning allows models to learn and adapt to new changes and tasks over time. However, in continual and sequential learning scenarios, in which models are trained on different data with various distributions, neural networks (NNs) tend to forget previously learned knowledge. This phenomenon is often referred to as catastrophic forgetting, and it is an inevitable problem for continual learning models in dynamic environments. To address this issue, we propose a method, called continual Bayesian learning networks (CBLNs), which enables a network to allocate additional resources to adapt to new tasks without forgetting the previously learned ones. Using a Bayesian NN, a CBLN maintains a mixture of Gaussian posterior distributions that are associated with different tasks. The proposed method optimizes the number of resources needed to learn each task and avoids an exponential increase in the resources involved in learning multiple tasks. It does not need access to past training data and can automatically choose suitable weights to classify data points at test time based on an uncertainty criterion. We have evaluated the method on the MNIST and UCR time-series data sets. The evaluation results show that the method addresses the catastrophic forgetting problem at a promising rate compared with state-of-the-art models.
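To make the mechanism concrete, below is a minimal NumPy sketch of the test-time idea described in the abstract: keep one Gaussian weight posterior per task and route an input to the task whose posterior yields the least-uncertain prediction. This is an assumption-laden illustration, not the authors' implementation; the one-layer classifier, the placeholder posterior parameters, and the variance-of-probabilities uncertainty measure are choices made only for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

class TaskPosterior:
    """Gaussian posterior over the weights of a one-layer classifier.

    In CBLN these parameters would come from variational training on one
    task; here they are random placeholders so the routing logic runs.
    """
    def __init__(self, n_in, n_out):
        self.mu = rng.normal(0.0, 0.1, size=(n_in, n_out))
        self.sigma = np.full((n_in, n_out), 0.05)

    def sample_logits(self, x, n_samples=20):
        # Monte Carlo samples of the predictive logits under this posterior.
        w = rng.normal(self.mu, self.sigma, size=(n_samples, *self.mu.shape))
        return np.einsum('d,sdk->sk', x, w)  # (n_samples, n_classes)

def predict(x, posteriors, n_samples=20):
    """Route x to the task posterior with the lowest predictive uncertainty."""
    best_task, best_unc, best_probs = None, np.inf, None
    for task_id, post in enumerate(posteriors):
        probs = softmax(post.sample_logits(x, n_samples))
        # One plausible uncertainty criterion (an assumption, not the
        # paper's exact formula): mean variance of the class probabilities
        # across the Monte Carlo samples.
        unc = probs.var(axis=0).mean()
        if unc < best_unc:
            best_task, best_unc, best_probs = task_id, unc, probs.mean(axis=0)
    return best_task, best_probs

# Toy usage: two task-specific posteriors over 8-dim inputs, 3 classes each.
posteriors = [TaskPosterior(8, 3) for _ in range(2)]
task, probs = predict(rng.normal(size=8), posteriors)
print(f"selected task {task}, mean class probabilities {np.round(probs, 3)}")
```

Note that no past training data is consulted at test time, matching the abstract's claim: the only per-task state retained is the pair of posterior parameters (mu, sigma), and the uncertainty criterion alone decides which task's weights to use.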
Pages: 4243 - 4252
Page count: 10
Related Papers
50 in total
  • [1] Bayesian continual learning via spiking neural networks
    Skatchkovsky, Nicolas
    Jang, Hyeryung
    Simeone, Osvaldo
    FRONTIERS IN COMPUTATIONAL NEUROSCIENCE, 2022, 16
  • [2] Hierarchical Indian Buffet Neural Networks for Bayesian Continual Learning
    Kessler, Samuel
    Nguyen, Vu
    Zohren, Stefan
    Roberts, Stephen J.
    UNCERTAINTY IN ARTIFICIAL INTELLIGENCE, VOL 161, 2021, 161 : 749 - 759
  • [3] Learning from the Past: Continual Meta-Learning with Bayesian Graph Neural Networks
    Luo, Yadan
    Huang, Zi
    Zhang, Zheng
    Wang, Ziwei
    Baktashmotlagh, Mahsa
    Yang, Yang
    THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 5021 - 5028
  • [4] Continual Learning with Neural Networks: A Review
    Awasthi, Abhijeet
    Sarawagi, Sunita
    PROCEEDINGS OF THE 6TH ACM IKDD CODS AND 24TH COMAD, 2019, : 362 - 365
  • [5] Continual Learning on Facial Recognition Using Convolutional Neural Networks
    Feng, Jingjing
    Gomez, Valentina
    UPB Scientific Bulletin, Series C: Electrical Engineering and Computer Science, 2023, 85 (03) : 239 - 248
  • [6] Continual robot learning with constructive neural networks
    Grossmann, A
    Poli, R
    LEARNING ROBOTS, PROCEEDINGS, 1998, 1545 : 95 - 108
  • [7] Continual Learning with Sparse Progressive Neural Networks
    Ergun, Esra
    Toreyin, Behcet Ugur
    2020 28TH SIGNAL PROCESSING AND COMMUNICATIONS APPLICATIONS CONFERENCE (SIU), 2020,
  • [8] Sparse Progressive Neural Networks for Continual Learning
    Ergun, Esra
    Toreyin, Behcet Ugur
    ADVANCES IN COMPUTATIONAL COLLECTIVE INTELLIGENCE (ICCCI 2021), 2021, 1463 : 715 - 725
  • [9] Employing Convolutional Neural Networks for Continual Learning
    Jasinski, Marcin
    Wozniak, Michal
    ARTIFICIAL INTELLIGENCE AND SOFT COMPUTING, ICAISC 2022, PT I, 2023, 13588 : 288 - 297