Class Impression for Data-Free Incremental Learning

Cited by: 3
Authors
Ayromlou, Sana [1 ]
Abolmaesumi, Purang [1 ]
Tsang, Teresa [2 ]
Li, Xiaoxiao [1 ]
Affiliations
[1] Univ British Columbia, Vancouver, BC, Canada
[2] Vancouver Gen Hosp, Vancouver, BC, Canada
Source
MEDICAL IMAGE COMPUTING AND COMPUTER ASSISTED INTERVENTION, MICCAI 2022, PT IV | 2022 / Vol. 13434
Funding
Natural Sciences and Engineering Research Council of Canada; Canadian Institutes of Health Research;
DOI
10.1007/978-3-031-16440-8_31
Chinese Library Classification
TP39 [Computer Applications];
Discipline codes
081203; 0835;
Abstract
Standard deep learning-based classification approaches require collecting all samples from all classes in advance and are trained offline. This paradigm may not be practical in real-world clinical applications, where new classes are introduced incrementally as new data arrive. Class incremental learning is a strategy for learning from such data. A major challenge, however, is catastrophic forgetting, i.e., performance degradation on previous classes when a trained model is adapted to new data. To alleviate this, prior methods save a portion of the training data, which requires perpetual storage and may raise privacy issues. Here, we propose a novel data-free class incremental learning framework that first synthesizes data from the model trained on previous classes to generate a Class Impression, and then updates the model by combining the synthesized data with new class data. Furthermore, we incorporate a cosine-normalized cross-entropy loss to mitigate the adverse effects of class imbalance, a margin loss to increase the separation between previous and new classes, and an intra-domain contrastive loss to generalize the model trained on synthesized data to real data. We compare our framework with state-of-the-art class incremental learning methods and demonstrate improved classification accuracy on 11,062 echocardiography cine series of patients. Code is available at https://github.com/sanaAyrml/Class-Impresion-for-Data-free-Incremental-Learning
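The cosine-normalized cross-entropy loss mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: both the feature vectors and the classifier weights are L2-normalized so that logits become cosine similarities, which removes the magnitude bias between old (synthesized) and new classes; the `scale` temperature and the function name are assumptions for this sketch.

```python
import numpy as np

def cosine_normalized_ce(features, weights, labels, scale=10.0):
    """Cross-entropy over cosine-similarity logits.

    features: (N, D) feature vectors; weights: (C, D) class weights;
    labels: (N,) integer class indices; scale: hypothetical temperature.
    """
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    w = weights / np.linalg.norm(weights, axis=1, keepdims=True)
    logits = scale * f @ w.T                       # scaled cosine similarities
    logits -= logits.max(axis=1, keepdims=True)    # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels].mean()

# Toy check: features nearly aligned with their own class weight vector
# should yield a low loss regardless of feature magnitude.
rng = np.random.default_rng(0)
w = rng.normal(size=(3, 4))
x = 5.0 * w + 0.01 * rng.normal(size=(3, 4))   # large-magnitude, aligned features
loss = cosine_normalized_ce(x, w, np.array([0, 1, 2]))
```

Because only angles matter after normalization, scaling the features by 5.0 above does not change the loss, which is the property that keeps synthesized and real samples on an equal footing.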
Pages: 320-329 (10 pages)