Continual learning via region-aware memory

Cited by: 4
Authors
Zhao, Kai [1 ]
Fu, Zhenyong [1 ]
Yang, Jian [1 ]
Affiliations
[1] Nanjing Univ Sci & Technol, PCALab, Nanjing 210094, Peoples R China
Funding
National Science Foundation (USA); China Postdoctoral Science Foundation;
Keywords
Continual learning; Region-aware memory; Adversarial attack; Diverse samples;
DOI
10.1007/s10489-022-03928-z
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Continual learning for classification is a common learning scenario in practice, yet it remains an open challenge for deep neural networks (DNNs). Contemporary DNNs suffer from catastrophic forgetting: they are prone to forgetting previously acquired knowledge when learning new tasks. Storing a small portion of samples from old tasks in an episodic memory and replaying them when learning new tasks is an effective way to mitigate catastrophic forgetting. Given the storage constraint, an episodic memory with limited but diverse samples is preferable for continual learning. To select samples from various regions of the feature space, we propose a region-aware memory (RAM) construction method. Specifically, we exploit adversarial attack to approximately measure the distance of an example to its class decision boundary. Then, we uniformly choose samples with different distances to the decision boundary, i.e., samples from various regions, to store in the episodic memory. We evaluate RAM on the CIFAR10, CIFAR100, and ImageNet datasets in the 'blurry' setup of Prabhu et al. (2020) and Bang et al. (2021). Experimental results show that RAM outperforms state-of-the-art methods; in particular, the performance on ImageNet is boosted by 4.82%.
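The selection strategy described in the abstract can be sketched in a few lines. This is a minimal illustrative sketch, not the authors' implementation: the exact linear-classifier margin stands in for the paper's adversarial-attack distance estimate on a DNN, and `region_aware_select` shows only the bin-and-draw-uniformly step; function names and the binning scheme are assumptions.

```python
import numpy as np

def margin_linear(w, b, x):
    # For a linear binary classifier, the distance of x to the decision
    # boundary w.x + b = 0 is exact. The paper instead approximates this
    # quantity for a DNN by the smallest adversarial perturbation that
    # flips the prediction (this linear stand-in is an assumption).
    return abs(np.dot(w, x) + b) / np.linalg.norm(w)

def region_aware_select(distances, memory_size, num_bins=10):
    # Region-aware memory construction (sketch): sort samples by their
    # boundary distance, split them into equal-population bins
    # ("regions"), then draw from the bins round-robin so that every
    # region contributes samples to the episodic memory.
    order = np.argsort(np.asarray(distances))
    bins = np.array_split(order, num_bins)
    selected, i = [], 0
    while len(selected) < memory_size and i < num_bins * len(order):
        b_idx, k = i % num_bins, i // num_bins
        if k < len(bins[b_idx]):
            selected.append(int(bins[b_idx][k]))
        i += 1
    return selected

# Example: 100 samples whose distances happen to equal their index;
# a 10-slot memory then holds one sample from each of the 10 regions.
mem = region_aware_select(list(range(100)), memory_size=10)
```

The round-robin draw is what distinguishes this from herding- or margin-based selection: instead of preferring only boundary-near (hard) or boundary-far (prototypical) samples, every distance region is represented.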
Pages: 8389-8401
Page count: 13
Related papers
50 in total
  • [41] DCFT: Dependency-aware continual learning fine-tuning for sparse LLMs
    Wang, Yanzhe
    Wang, Yizhen
    Yin, Baoqun
    NEUROCOMPUTING, 2025, 636
  • [42] Hessian Aware Low-Rank Perturbation for Order-Robust Continual Learning
    Li, Jiaqi
    Lai, Yuanhao
    Wang, Rui
    Shui, Changjian
    Sahoo, Sabyasachi
    Ling, Charles X.
    Yang, Shichun
    Wang, Boyu
    Gagne, Christian
    Zhou, Fan
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2024, 36 (11) : 6385 - 6396
  • [43] Continual compression model for online continual learning
    Ye, Fei
    Bors, Adrian G.
    APPLIED SOFT COMPUTING, 2024, 167
  • [44] Triple-Memory Networks: A Brain-Inspired Method for Continual Learning
    Wang, Liyuan
    Lei, Bo
    Li, Qian
    Su, Hang
    Zhu, Jun
    Zhong, Yi
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022, 33 (05) : 1925 - 1934
  • [45] Continual portfolio selection in dynamic environments via incremental reinforcement learning
    Liu, Shu
    Wang, Bo
    Li, Huaxiong
    Chen, Chunlin
    Wang, Zhi
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2023, 14 (01) : 269 - 279
  • [46] CONTINUAL LEARNING OF PREDICTIVE MODELS IN VIDEO SEQUENCES VIA VARIATIONAL AUTOENCODERS
    Campo, Damian
    Slavic, Giulia
    Baydoun, Mohamad
    Marcenaro, Lucio
    Regazzoni, Carlo
    2020 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2020, : 753 - 757
  • [47] Long-Term Visual Object Tracking via Continual Learning
    Zhang, Hui
    Zhu, Mu
    Zhang, Jing
    Zhuo, Li
    IEEE ACCESS, 2019, 7 : 182548 - 182558
  • [49] Logarithmic Continual Learning
    Masarczyk, Wojciech
    Wawrzynski, Pawel
    Marczak, Daniel
    Deja, Kamil
    Trzcinski, Tomasz
    IEEE ACCESS, 2022, 10 : 117001 - 117010
  • [50] Open-world continual learning: Unifying novelty detection and continual learning
    Kim, Gyuhak
    Xiao, Changnan
    Konishi, Tatsuya
    Ke, Zixuan
    Liu, Bing
    ARTIFICIAL INTELLIGENCE, 2025, 338