CRNet: A Fast Continual Learning Framework With Random Theory

Cited by: 9
Authors
Li, Depeng [1 ,2 ,3 ]
Zeng, Zhigang [1 ,2 ,3 ]
Institutions
[1] Huazhong Univ Sci & Technol, Sch Artificial Intelligence & Automation, Wuhan 430074, Peoples R China
[2] Huazhong Univ Sci & Technol, Inst Artificial Intelligence, Wuhan 430074, Peoples R China
[3] Key Lab Image Proc & Intelligent Control, Minist Educ China, Wuhan 430074, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Catastrophic forgetting; continual learning; incremental learning; randomized learning technique; random vector functional-link network; STOCHASTIC CONFIGURATION NETWORKS; NEURAL-NETWORKS; APPROXIMATION;
DOI
10.1109/TPAMI.2023.3262853
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Artificial neural networks are prone to catastrophic forgetting: a network trained on something new tends to rapidly forget what it learned previously, a common phenomenon in connectionist models. In this work, we propose an effective and efficient continual learning framework that uses random theory, together with Bayes' rule, to equip a single model with the ability to learn from streaming data. The core idea of our framework is to preserve the performance of old tasks by guiding the output weights to stay in a region of low error while encountering new tasks. In contrast to existing continual learning approaches, our main contributions are (1) closed-form solutions with detailed theoretical analysis; (2) training continual learners with a single pass over the samples; (3) notable advantages in ease of implementation, parameter efficiency, fast convergence, and strong task-order robustness. Comprehensive experiments on popular image classification benchmarks (FashionMNIST, CIFAR-100, and ImageNet) demonstrate that, in the class-incremental learning scenario, our methods substantially outperform extensive state-of-the-art methods in training speed while maintaining superior accuracy and a comparable number of parameters. Code is available at https://github.com/toil2sweet/CRNet.
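
The abstract describes fixed random features (in the spirit of a random vector functional-link network) combined with closed-form, one-pass updates of the output weights as new tasks arrive. The sketch below illustrates that general recipe only; it is an assumption about the approach, not the authors' CRNet implementation, and all class, function, and parameter names are hypothetical.

import numpy as np

class RandomFeatureContinualLearner:
    """Illustrative sketch: random hidden features + closed-form output-weight updates."""

    def __init__(self, in_dim, hidden_dim, num_classes, reg=1e-2, seed=0):
        rng = np.random.default_rng(seed)
        # Fixed random input-to-hidden weights; only the output layer is ever trained.
        self.W_in = rng.normal(scale=1.0, size=(in_dim, hidden_dim))
        self.b = rng.uniform(-1.0, 1.0, size=hidden_dim)
        # Accumulated second-order statistics allow updating without revisiting old data.
        self.A = reg * np.eye(hidden_dim)              # regularized hidden covariance
        self.C = np.zeros((hidden_dim, num_classes))   # hidden-target cross-correlation
        self.W_out = np.zeros((hidden_dim, num_classes))

    def _features(self, X):
        # Random nonlinear feature map shared across all tasks.
        return np.tanh(X @ self.W_in + self.b)

    def fit_task(self, X, y):
        """One pass over a new task's data, then a closed-form ridge-regression solve."""
        H = self._features(X)
        Y = np.eye(self.C.shape[1])[y]                 # one-hot targets
        self.A += H.T @ H
        self.C += H.T @ Y
        self.W_out = np.linalg.solve(self.A, self.C)

    def predict(self, X):
        return (self._features(X) @ self.W_out).argmax(axis=1)

# Illustrative usage with hypothetical per-task data:
#   learner = RandomFeatureContinualLearner(in_dim=784, hidden_dim=512, num_classes=10)
#   learner.fit_task(X_task1, y_task1)
#   learner.fit_task(X_task2, y_task2)
#   preds = learner.predict(X_test)

Because the accumulated statistics summarize all tasks seen so far, each update is a single linear solve rather than iterative gradient training, which is consistent with the abstract's claims of one-pass learning and fast convergence.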
Pages: 10731 - 10744
Number of pages: 14