Computationally Efficient Rehearsal for Online Continual Learning

Cited by: 2
Authors
Davalas, Charalampos [1 ]
Michail, Dimitrios [1 ]
Diou, Christos [1 ]
Varlamis, Iraklis [1 ]
Tserpes, Konstantinos [1 ]
Affiliations
[1] Harokopio Univ Athens, Dept Informat & Telemat, Athens 17778, Greece
Source
IMAGE ANALYSIS AND PROCESSING, ICIAP 2022, PT III | 2022, Vol. 13233
Keywords
Catastrophic forgetting; Continual learning; Online learning
DOI
10.1007/978-3-031-06433-3_4
CLC Number
TP18 [Theory of Artificial Intelligence]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Continual learning is a crucial ability for learning systems that must adapt to changing data distributions without degrading performance on what they have already learned. Rehearsal methods offer a simple countermeasure against this catastrophic forgetting, which frequently occurs in dynamic settings and is a major limitation of machine learning models. These methods continuously train neural networks on a mix of data from the stream and from a rehearsal buffer that maintains past training samples. Although the rehearsal approach is reasonable and simple to implement, its effectiveness and efficiency are significantly affected by several hyperparameters, such as the number of training iterations performed at each step, the choice of learning rate, and the choice of whether to retrain the agent at each step. These choices are especially important in the resource-constrained environments commonly found in online continual learning for image analysis. This work evaluates several rehearsal training strategies for online continual learning and proposes the combined use of a drift detector that decides (a) when to train using data from the buffer and the online stream, and (b) how to train, based on a combination of heuristics. Experiments on the MNIST and CIFAR-10 image classification datasets demonstrate the effectiveness of the proposed approach over baseline training strategies at a fraction of the computational cost.
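The training policy described in the abstract can be sketched in code. The following is a minimal, illustrative Python/PyTorch sketch, not the paper's implementation: the reservoir-sampled RehearsalBuffer, the accuracy-window heuristic in drift_detected, and the policy of spending extra training iterations only on suspected drift are all assumptions chosen to match the abstract's description of drift-gated rehearsal.

```python
# Hypothetical sketch of drift-gated rehearsal training. All names
# (RehearsalBuffer, drift_detected, online_step) are illustrative.
import random
from collections import deque

import torch
import torch.nn.functional as F


class RehearsalBuffer:
    """Fixed-capacity buffer of past samples, filled by reservoir sampling."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = []   # list of (x, y) tensor pairs
        self.seen = 0    # total samples observed on the stream

    def add(self, x, y):
        self.seen += 1
        if len(self.data) < self.capacity:
            self.data.append((x, y))
        else:
            # Reservoir sampling: every seen sample has equal keep probability.
            j = random.randrange(self.seen)
            if j < self.capacity:
                self.data[j] = (x, y)

    def sample(self, k):
        batch = random.sample(self.data, min(k, len(self.data)))
        xs, ys = zip(*batch)
        return torch.stack(xs), torch.stack(ys)


def drift_detected(acc_window, threshold=0.1):
    """Toy drift signal: accuracy over the recent half of the window drops
    by more than `threshold` relative to the older half."""
    if len(acc_window) < acc_window.maxlen:
        return False
    half = len(acc_window) // 2
    older = sum(list(acc_window)[:half]) / half
    recent = sum(list(acc_window)[half:]) / (len(acc_window) - half)
    return older - recent > threshold


def online_step(model, optimizer, stream_x, stream_y, buffer, acc_window,
                rehearsal_ratio=0.5, iters_on_drift=5):
    # Prequential evaluation: test on the incoming batch before training.
    with torch.no_grad():
        acc = (model(stream_x).argmax(1) == stream_y).float().mean().item()
    acc_window.append(acc)

    # One illustrative policy: a single cheap update normally, and several
    # rehearsal iterations when drift is suspected.
    iters = iters_on_drift if drift_detected(acc_window) else 1
    for _ in range(iters):
        x, y = stream_x, stream_y
        if buffer.data:
            k = max(int(rehearsal_ratio * len(stream_x)), 1)
            bx, by = buffer.sample(k)
            x = torch.cat([stream_x, bx])   # mix stream and buffer samples
            y = torch.cat([stream_y, by])
        optimizer.zero_grad()
        loss = F.cross_entropy(model(x), y)
        loss.backward()
        optimizer.step()

    # Store the new stream samples for future rehearsal.
    for xi, yi in zip(stream_x, stream_y):
        buffer.add(xi, yi)

# Usage (illustrative): acc_window = deque(maxlen=20) must persist across
# steps; call online_step(model, opt, x, y, buffer, acc_window) per batch.
```

Gating the number of update iterations on a drift signal is what yields the computational savings the abstract claims: when the data distribution is stable, the model trains minimally, and the full rehearsal cost is paid only when the detector fires.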
Pages: 39-49 (11 pages)