Continual and One-Shot Learning Through Neural Networks with Dynamic External Memory

Cited: 13
Authors
Lüders, Benno [1]
Schläger, Mikkel [1]
Korach, Aleksandra [1]
Risi, Sebastian [1]
Affiliations
[1] IT Univ Copenhagen, Copenhagen, Denmark
Source
APPLICATIONS OF EVOLUTIONARY COMPUTATION, EVOAPPLICATIONS 2017, PT I | 2017, Vol. 10199
Keywords
Neural Turing Machine; Continual learning; Adaptive neural networks; Plasticity; Memory; Neuroevolution
DOI
10.1007/978-3-319-55849-3_57
Chinese Library Classification (CLC)
TP301 [Theory, Methods]
Discipline Code
081202
Abstract
Training neural networks to quickly learn new skills without forgetting previously learned skills is an important open challenge in machine learning. A common problem for adaptive networks that can learn during their lifetime is that the weights encoding a particular task are often overridden when a new task is learned. This paper takes a step in overcoming this limitation by building on the recently proposed Evolving Neural Turing Machine (ENTM) approach. In the ENTM, neural networks are augmented with an external memory component that they can write to and read from, which allows them to store associations quickly and over long periods of time. The results in this paper demonstrate that the ENTM is able to perform one-shot learning in reinforcement learning tasks without catastrophic forgetting of previously stored associations. Additionally, we introduce a new ENTM default jump mechanism that makes it easier to find unused memory locations and therefore facilitates the evolution of continual learning networks. Our results suggest that augmenting evolving networks with an external memory component not only provides a viable mechanism for adaptive behaviors in neuroevolution but also allows these networks to perform continual and one-shot learning at the same time.
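The abstract describes the core mechanism in a few words: a controller network reads from and writes to an external memory tape, and a default jump moves the head to an unused location so that new associations do not overwrite old ones. The Python sketch below illustrates that idea under simplifying assumptions of my own; the class name, parameter values, and the all-zero test for "unused" slots are illustrative, not the paper's implementation (in the ENTM the head signals are outputs of a network evolved with NEAT).

import numpy as np

class ExternalMemory:
    """Sketch of an ENTM-style memory tape with a single read/write head."""

    def __init__(self, num_slots=32, slot_width=8):
        self.memory = np.zeros((num_slots, slot_width))
        self.head = 0  # current head position on the tape

    def write(self, vector, interpolation):
        # Blend a write vector into the current slot; interpolation in
        # [0, 1] controls how strongly the new content overrides the old.
        v = np.asarray(vector, dtype=float)
        self.memory[self.head] = (
            interpolation * v + (1.0 - interpolation) * self.memory[self.head]
        )

    def read(self):
        # Content under the head, fed back to the controller network as
        # extra inputs on the next time step.
        return self.memory[self.head].copy()

    def content_jump(self, key, threshold=0.5):
        # Move the head to the slot closest to the key vector, if any
        # slot is similar enough.
        dists = np.linalg.norm(self.memory - np.asarray(key, dtype=float), axis=1)
        best = int(np.argmin(dists))
        if dists[best] < threshold:
            self.head = best

    def default_jump(self):
        # The paper's default-jump idea, sketched: move the head to the
        # first unused slot so new writes do not overwrite previously
        # stored associations. "Unused" is approximated here as all-zero.
        unused = np.flatnonzero(~np.any(self.memory != 0.0, axis=1))
        if unused.size > 0:
            self.head = int(unused[0])

    def shift(self, direction):
        # Relative head movement: -1 (left), 0 (stay), +1 (right).
        self.head = (self.head + int(direction)) % self.memory.shape[0]

In such a setup, the controller would emit a write vector, an interpolation value, and jump and shift signals on each time step, and receive the read vector as additional input on the next; jumping to an unused slot is what lets one-shot associations persist instead of being overwritten.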
Pages: 886-901
Page count: 16