Continual and One-Shot Learning Through Neural Networks with Dynamic External Memory

Cited by: 13
Authors
Luders, Benno [1 ]
Schlager, Mikkel [1 ]
Korach, Aleksandra [1 ]
Risi, Sebastian [1 ]
Affiliations
[1] IT Univ Copenhagen, Copenhagen, Denmark
Source
APPLICATIONS OF EVOLUTIONARY COMPUTATION, EVOAPPLICATIONS 2017, PT I | 2017, Vol. 10199
Keywords
Neural Turing Machine; Continual learning; Adaptive neural networks; Plasticity; Memory; Neuroevolution;
DOI
10.1007/978-3-319-55849-3_57
Chinese Library Classification (CLC)
TP301 [Theory, Methods]
Discipline Classification Code
081202
Abstract
Training neural networks to quickly learn new skills without forgetting previously learned skills is an important open challenge in machine learning. A common problem for adaptive networks that can learn during their lifetime is that the weights encoding a particular task are often overridden when a new task is learned. This paper takes a step toward overcoming this limitation by building on the recently proposed Evolving Neural Turing Machine (ENTM) approach. In the ENTM, neural networks are augmented with an external memory component that they can write to and read from, which allows them to store associations quickly and over long periods of time. The results in this paper demonstrate that the ENTM is able to perform one-shot learning in reinforcement learning tasks without catastrophic forgetting of previously stored associations. Additionally, we introduce a new ENTM default jump mechanism that makes it easier to find unused memory locations and therefore facilitates the evolution of continual learning networks. Our results suggest that augmenting evolving networks with an external memory component is not only a viable mechanism for adaptive behaviors in neuroevolution but also allows these networks to perform continual and one-shot learning at the same time.
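
To make the mechanism described in the abstract concrete, below is a minimal Python sketch of an ENTM-style external memory with a single read/write head and a default jump operation. This is not the authors' implementation; all names (ExternalMemory, default_jump, the bank dimensions, and the all-zero test for an unused slot) are illustrative assumptions based only on the abstract's description.

    # Minimal sketch of an ENTM-style external memory (illustrative only,
    # not the authors' code). The "default jump" moves the head to the
    # first unused slot so new writes need not overwrite stored associations.
    import numpy as np

    class ExternalMemory:
        def __init__(self, n_slots=32, width=8):
            self.mem = np.zeros((n_slots, width))  # memory bank, initially unused
            self.head = 0                          # current read/write head position

        def write(self, vector):
            # Overwrite the slot under the head with the network's write vector.
            self.mem[self.head] = vector

        def read(self):
            # Return the contents of the slot under the head.
            return self.mem[self.head].copy()

        def shift(self, direction):
            # Relative head movement (e.g. -1, 0, +1), wrapping around the bank.
            self.head = (self.head + direction) % len(self.mem)

        def default_jump(self):
            # Jump to the first all-zero (unused) slot, if any exists.
            for i, slot in enumerate(self.mem):
                if not slot.any():
                    self.head = i
                    return
            # If every slot is in use, leave the head where it is.

Usage under these assumptions: after mem.write(np.ones(8)) stores an association at slot 0, mem.default_jump() moves the head to slot 1, the first unused location, so a subsequent write cannot clobber the stored association. This illustrates why such a mechanism would help avoid catastrophic forgetting.
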
Pages: 886-901
Page count: 16
Related Papers
50 records in total
  • [41] Edge Continual Learning for Dynamic Digital Twins over Wireless Networks
    Hashash, Omar
    Chaccour, Christina
    Saad, Walid
    2022 IEEE 23RD INTERNATIONAL WORKSHOP ON SIGNAL PROCESSING ADVANCES IN WIRELESS COMMUNICATION (SPAWC), 2022,
  • [42] Continual Learning Through One-Class Classification Using VAE
    Wiewel, Felix
    Brendle, Andreas
    Yang, Bin
    2020 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, 2020, : 3307 - 3311
  • [43] Continual Few-Shot Relation Learning Via Dynamic Margin Loss and Space Recall
    Li, Yongbing
    Duan, Pengfei
    Rong, Yi
    Yang, Yiwen
    Wang, Aoxing
    2024 16TH INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND COMPUTING, ICMLC 2024, 2024, : 492 - 497
  • [44] Triple-Memory Networks: A Brain-Inspired Method for Continual Learning
    Wang, Liyuan
    Lei, Bo
    Li, Qian
    Su, Hang
    Zhu, Jun
    Zhong, Yi
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022, 33 (05) : 1925 - 1934
  • [45] Learned Spatial Schemas and Prospective Hippocampal Activity Support Navigation After One-Shot Learning
    van Kesteren, Marlieke T. R.
    Brown, Thackery I.
    Wagner, Anthony D.
    FRONTIERS IN HUMAN NEUROSCIENCE, 2018, 12
  • [46] Gating Mechanism in Deep Neural Networks for Resource-Efficient Continual Learning
    Jin, Hyundong
    Yun, Kimin
    Kim, Eunwoo
    IEEE ACCESS, 2022, 10 : 18776 - 18786
  • [47] Memory-Dependent Computation and Learning in Spiking Neural Networks Through Hebbian Plasticity
    Limbacher, Thomas
    Ozdenizci, Ozan
    Legenstein, Robert
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2023, : 2551 - 2562
  • [50] The reliability and internal consistency of one-shot and flicker change detection for measuring individual differences in visual working memory capacity
    Pailian, Hrag
    Halberda, Justin
    MEMORY & COGNITION, 2015, 43 (03) : 397 - 420