Dynamic Memory-Based Continual Learning with Generating and Screening

Cited by: 0
Authors
Tao, Siying [1 ]
Huang, Jinyang [1 ]
Zhang, Xiang [2 ]
Sun, Xiao [1 ,3 ]
Gu, Yu [4 ]
Affiliations
[1] Hefei Univ Technol, Sch Comp Sci & Informat Engn, Hefei, Peoples R China
[2] Univ Sci & Technol China, Sch Cybers Sci & Technol, Hefei, Peoples R China
[3] Hefei Comprehens Natl Sci Ctr, Inst Artificial Intelligence, Hefei, Peoples R China
[4] Univ Elect Sci & Technol China, Sch Comp Sci & Engn, I Lab, Chengdu, Peoples R China
Source
ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING, ICANN 2023, PT III | 2023, Vol. 14256
Keywords
Continual learning; Generative replay; Deep learning
DOI
10.1007/978-3-031-44213-1_31
CLC Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Deep neural networks suffer from catastrophic forgetting when continually learning new tasks. Although simply replaying all previous data alleviates the problem, it requires large memory and, worse, is often infeasible in real-world applications where access to past data is limited. We therefore propose a two-stage framework that dynamically reproduces data features of previous tasks to reduce catastrophic forgetting. Specifically, at each task step, we use a new memory module to learn the data distribution of the new task and reproduce pseudo-data from previous memory modules for joint training. This enables us to integrate new visual concepts while retaining learned knowledge, achieving a better stability-plasticity balance. We introduce an N-step model fusion strategy to accelerate the memorization process of the memory module and a screening strategy to control the quantity and quality of generated data, reducing distribution differences. Experiments on the CIFAR-100, MNIST, and SVHN datasets demonstrate the effectiveness of our method.
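The abstract describes screening generated pseudo-data to control its quantity and quality before replay. The paper itself does not publish its screening criterion here, so the following is only an illustrative sketch under assumed details: it keeps pseudo-samples whose confidence under the previous-task classifier exceeds a threshold, capped at a per-class quota. The function name `screen_generated` and all parameters are hypothetical.

```python
import numpy as np

def screen_generated(samples, confidences, labels,
                     threshold=0.8, per_class_quota=100):
    """Sketch of a screening strategy for generative replay (assumed details).

    Keeps pseudo-samples whose classifier confidence is at least `threshold`,
    retaining at most `per_class_quota` of the most confident samples per class.
    This bounds the quantity of replayed data while filtering low-quality
    generations that would widen the distribution gap to the real data.
    """
    kept_idx = []
    for c in np.unique(labels):
        # indices of this class's samples that pass the confidence filter
        idx = np.where((labels == c) & (confidences >= threshold))[0]
        # rank survivors by confidence and keep at most the quota
        idx = idx[np.argsort(-confidences[idx])][:per_class_quota]
        kept_idx.extend(idx.tolist())
    kept_idx = np.array(sorted(kept_idx), dtype=int)
    return samples[kept_idx], labels[kept_idx]

# Toy usage: 10 generated samples, 2 classes, quota of 2 per class.
rng = np.random.default_rng(0)
samples = rng.normal(size=(10, 4))
labels = np.array([0, 0, 0, 0, 0, 1, 1, 1, 1, 1])
confidences = np.array([0.9, 0.5, 0.95, 0.85, 0.7,
                        0.99, 0.2, 0.81, 0.6, 0.9])
kept_x, kept_y = screen_generated(samples, confidences, labels,
                                  threshold=0.8, per_class_quota=2)
# 4 samples survive: the 2 most confident per class
```

The kept samples would then be mixed with the new task's real data when training both the classifier and the new memory module.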
Pages: 365-376 (12 pages)