Continual learning via region-aware memory

Cited by: 4
Authors
Zhao, Kai [1 ]
Fu, Zhenyong [1 ]
Yang, Jian [1 ]
Affiliation
[1] Nanjing Univ Sci & Technol, PCALab, Nanjing 210094, Peoples R China
Funding
US National Science Foundation; China Postdoctoral Science Foundation;
Keywords
Continual learning; Region-aware memory; Adversarial attack; Diverse samples;
DOI
10.1007/s10489-022-03928-z
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Continual learning for classification is a common learning scenario in practice, yet it remains an open challenge for deep neural networks (DNNs). Contemporary DNNs suffer from catastrophic forgetting: they are prone to forgetting previously acquired knowledge when learning new tasks. Storing a small portion of the samples of old tasks in an episodic memory and replaying them when learning new tasks is an effective way to mitigate catastrophic forgetting. Because of the storage constraint, an episodic memory with limited but diverse samples is preferable for continual learning. To select samples from various regions of the feature space, we propose a region-aware memory (RAM) construction method. Specifically, we exploit adversarial attack to approximately measure the distance of an example to its class decision boundary. We then uniformly choose samples with different distances to the decision boundary, i.e., samples from various regions, to store in the episodic memory. We evaluate RAM on the CIFAR10, CIFAR100 and ImageNet datasets in the 'blurry' setup of Prabhu et al. (2020) and Bang et al. (2021). Experimental results show that RAM outperforms state-of-the-art methods; in particular, the performance on ImageNet is boosted by 4.82%.
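The selection step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name `region_aware_select` is hypothetical, and it assumes the per-sample distance to the class decision boundary has already been approximated (e.g., by the norm of the minimal adversarial perturbation that flips the prediction).

```python
import numpy as np

def region_aware_select(distances, budget, rng=None):
    """Pick `budget` sample indices whose boundary distances cover the
    observed range as uniformly as possible.

    distances : 1-D array of per-sample distances to the class decision
        boundary (here assumed precomputed, e.g., via the norm of a
        minimal adversarial perturbation).
    budget    : number of samples to keep in the episodic memory.
    """
    distances = np.asarray(distances, dtype=float)
    order = np.argsort(distances)  # indices sorted near -> far from boundary
    # Split the sorted indices into `budget` contiguous distance bands
    # and take one representative per band, so every region of the
    # feature space contributes to the memory.
    bands = np.array_split(order, budget)
    rng = rng if rng is not None else np.random.default_rng(0)
    return np.array([rng.choice(band) for band in bands if len(band)])
```

With 100 samples and a budget of 5, each selected index comes from a distinct 20-sample distance band, giving a memory that spans samples near the boundary as well as deep inside the class region.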
Pages: 8389-8401
Page count: 13
Related papers
50 records in total
  • [1] Continual learning via region-aware memory
    Kai Zhao
    Zhenyong Fu
    Jian Yang
    Applied Intelligence, 2023, 53 : 8389 - 8401
  • [2] Continual learning for seizure prediction via memory projection strategy
    Shi, Yufei
    Tang, Shishi
    Li, Yuxuan
    He, Zhipeng
    Tang, Shengsheng
    Wang, Ruixuan
    Zheng, Weishi
    Chen, Ziyi
    Zhou, Yi
    Computers in Biology and Medicine, 2024, 181
  • [3] Memory Bounds for Continual Learning
    Chen, Xi
    Papadimitriou, Christos
    Peng, Binghui
    2022 IEEE 63RD ANNUAL SYMPOSIUM ON FOUNDATIONS OF COMPUTER SCIENCE (FOCS), 2022, : 519 - 530
  • [4] Online continual learning with declarative memory
    Xiao, Zhe
    Du, Zhekai
    Wang, Ruijin
    Gan, Ruimeng
    Li, Jingjing
    NEURAL NETWORKS, 2023, 163 : 146 - 155
  • [5] Condensed Composite Memory Continual Learning
    Wiewel, Felix
    Yang, Bin
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021,
  • [6] Memory Enhanced Replay for Continual Learning
    Xu, Guixun
    Guo, Wenhui
    Wang, Yanjiang
    2022 16TH IEEE INTERNATIONAL CONFERENCE ON SIGNAL PROCESSING (ICSP2022), VOL 1, 2022, : 218 - 222
  • [7] Memory-aware continual learning with multi-modal social media streams for disaster classification
    Mao, Yiqiao
    Yan, Xiaoqiang
    Hu, Zirui
    Zhang, Xuguang
    Ye, Yangdong
    Yu, Hui
    ADVANCED ENGINEERING INFORMATICS, 2024, 62
  • [8] Continual learning of neural networks for quality prediction in production using memory aware synapses and weight transfer
    Tercan, Hasan
    Deibert, Philipp
    Meisen, Tobias
    JOURNAL OF INTELLIGENT MANUFACTURING, 2022, 33 (01) : 283 - 292
  • [9] PositCL: Compact Continual Learning with Posit Aware Quantization
    Karia, Vedant
    Zyarah, Abdullah M.
    Kudithipudi, Dhireesha
    PROCEEDING OF THE GREAT LAKES SYMPOSIUM ON VLSI 2024, GLSVLSI 2024, 2024, : 645 - 650