Continual Learning with Neuron Activation Importance

Cited by: 1
Authors
Kim, Sohee [1 ]
Lee, Seungkyu [1 ]
Affiliations
[1] Kyung Hee Univ, Dept Comp Sci & Engn, 1732 Deogyeong Daero, Yongin, Gyeonggi Do, South Korea
Source
IMAGE ANALYSIS AND PROCESSING, ICIAP 2022, PT I | 2022 / Vol. 13231
Keywords
Continual learning; Neuron importance
DOI
10.1007/978-3-031-06427-2_26
CLC Classification Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Continual learning is a form of online learning over multiple sequential tasks. One of its critical barriers is that a network must learn each new task while preserving the knowledge of old tasks, without access to any data from those tasks. We propose a neuron activation importance-based regularization method for stable continual learning regardless of task order. We conduct comprehensive experiments on existing benchmark data sets to evaluate not only the stability and plasticity of our method, with improved classification accuracy, but also the robustness of its performance to changes in task order.
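The abstract describes the method only at a high level. As a rough illustration of this family of importance-weighted regularization (EWC/MAS-style penalties), the following PyTorch sketch estimates a per-neuron importance from activation magnitudes on old-task data and penalizes drift of the parameters feeding important neurons. All names (`estimate_importance`, `importance_penalty`, `lambda_reg`) and the exact penalty form are illustrative assumptions, not the paper's formulation.

```python
# A minimal sketch (not the authors' released code) of activation-importance
# regularization for continual learning: per-neuron importance is estimated
# from activation statistics on a finished task, and a quadratic penalty
# discourages changing the parameters that feed important neurons.
import torch
import torch.nn as nn

class MLP(nn.Module):
    def __init__(self, in_dim=784, hidden=256, out_dim=10):
        super().__init__()
        self.fc1 = nn.Linear(in_dim, hidden)
        self.fc2 = nn.Linear(hidden, out_dim)

    def forward(self, x):
        h = torch.relu(self.fc1(x))
        return self.fc2(h), h  # also expose hidden activations

@torch.no_grad()
def estimate_importance(model, loader, device="cpu"):
    """Mean absolute hidden activation per neuron over old-task data
    (an assumed proxy for the paper's neuron activation importance)."""
    total, count = None, 0
    for x, _ in loader:
        _, h = model(x.to(device))
        s = h.abs().sum(dim=0)
        total = s if total is None else total + s
        count += h.shape[0]
    return total / count  # shape: (hidden,)

def importance_penalty(model, old_params, importance):
    """Quadratic drift penalty on fc1, weighted per output neuron."""
    dw = model.fc1.weight - old_params["fc1.weight"]
    db = model.fc1.bias - old_params["fc1.bias"]
    return (importance * (dw ** 2).sum(dim=1)).sum() + (importance * db ** 2).sum()

# Usage sketch:
#   after task t:  importance = estimate_importance(model, old_task_loader)
#                  old_params = {k: v.detach().clone()
#                                for k, v in model.named_parameters()}
#   during task t+1:  loss = task_loss + lambda_reg * importance_penalty(
#                                model, old_params, importance)
```

The quadratic-drift penalty mirrors how EWC and MAS anchor parameters important to previous tasks; the paper's specific contribution is the activation-based importance estimate and its robustness to task ordering, which this sketch only approximates.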
Pages: 310-321 (12 pages)