Exemplar-Based Continual Learning via Contrastive Learning

Cited by: 2
Authors
Chen S. [1 ,2 ]
Zhang M. [2 ]
Zhang J. [2 ]
Huang K. [2 ,3 ,4 ]
Affiliations
[1] University of Chinese Academy of Sciences (UCAS), School of Artificial Intelligence, Beijing
[2] Chinese Academy of Sciences (CASIA), Center for Research on Intelligent System and Engineering (CRISE), Institute of Automation, Beijing
[3] University of Chinese Academy of Sciences, School of Artificial Intelligence, Beijing
[4] CAS Center for Excellence in Brain Science and Intelligence Technology, Shanghai
Source
IEEE Transactions on Artificial Intelligence | 2024 / Vol. 5 / Issue 07
Keywords
Continual learning (CL); contrastive learning; incremental learning; self-supervised learning;
DOI
10.1109/TAI.2024.3355879
CLC Classification
TB18 [Ergonomics]; Q98 [Anthropology];
Subject Classification
030303; 1201;
Abstract
Despite the impressive performance of deep learning models, they suffer from catastrophic forgetting: a significant decline in overall performance when trained on new classes added incrementally. The primary cause of this phenomenon is overlap or confusion between the feature-space representations of old and new classes. In this study, we examine this issue and propose a model that mitigates the problem by learning more transferable features. We employ contrastive learning, a recent breakthrough in deep learning that can learn visual representations better than task-specific supervised methods. Specifically, we introduce an exemplar-based continual learning (CL) method that uses contrastive learning to learn a task-agnostic and continuously improving feature representation. However, the class imbalance between old and new samples in CL can degrade the final learned features. To address this issue, we propose two approaches. First, we use a novel exemplar-based method, called determinantal point processes experience replay (DPPER), to improve buffer diversity during memory updates. Second, we propose an old-sample compensation weight (CW) to resist corruption of the old model caused by new-task learning during memory retrieval. Our experimental results on benchmark datasets demonstrate that our approach achieves performance comparable to or better than state-of-the-art methods. © 2020 IEEE.
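The abstract states that DPPER uses determinantal point processes to improve the diversity of the replay buffer during memory updates; the exact kernel and selection rule are not given in the abstract. As a rough illustration only, a generic greedy MAP-style DPP selection over a cosine-similarity kernel (my assumption, not the paper's stated method) might look like:

```python
import numpy as np

def greedy_dpp_select(features: np.ndarray, k: int) -> list[int]:
    """Greedily pick k exemplar indices that maximize det(L_S),
    the DPP score of the selected subset S (promotes diversity)."""
    # Cosine-similarity kernel over L2-normalized feature vectors.
    feats = features / np.linalg.norm(features, axis=1, keepdims=True)
    L = feats @ feats.T

    selected: list[int] = []
    remaining = list(range(len(features)))
    for _ in range(k):
        best, best_score = None, -np.inf
        for i in remaining:
            idx = selected + [i]
            # log det of the kernel restricted to the candidate subset;
            # near-duplicate items make the submatrix nearly singular,
            # driving the score toward -inf, so diverse items win.
            _, logdet = np.linalg.slogdet(L[np.ix_(idx, idx)])
            if logdet > best_score:
                best_score, best = logdet, i
        selected.append(best)
        remaining.remove(best)
    return selected

# Two near-duplicate features (0, 1) and one orthogonal feature (2):
# the greedy DPP keeps one of the duplicates and the orthogonal point.
feats = np.array([[1.0, 0.0], [1.0, 0.01], [0.0, 1.0]])
print(greedy_dpp_select(feats, 2))  # e.g. [0, 2]
```

The greedy loop is a common approximation for DPP MAP inference; the paper may instead use sampling or a class-conditional kernel, which this sketch does not cover.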
Pages: 3313-3324
Number of pages: 11
Related Papers
50 records total
  • [1] Exemplar-Free Continual Representation Learning via Learnable Drift Compensation
    Gomez-Villa, Alex
    Goswami, Dipam
    Wang, Kai
    Bagdanov, Andrew D.
    Twardowski, Bartlomiej
    van de Weijer, Joost
    COMPUTER VISION-ECCV 2024, PT VII, 2025, 15065 : 473 - 490
  • [2] Contrastive Supervised Distillation for Continual Representation Learning
    Barletti, Tommaso
    Biondi, Niccolo
    Pernici, Federico
    Bruni, Matteo
    Del Bimbo, Alberto
    IMAGE ANALYSIS AND PROCESSING, ICIAP 2022, PT I, 2022, 13231 : 597 - 609
  • [3] CONTRASTIVE LEARNING FOR ONLINE SEMI-SUPERVISED GENERAL CONTINUAL LEARNING
    Michel, Nicolas
    Negrel, Romain
    Chierchia, Giovanni
    Bercher, Jean-Francois
    2022 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING, ICIP, 2022, : 1896 - 1900
  • [4] Continual Learning by Contrastive Learning of Regularized Classes in Multivariate Gaussian Distributions
    Moon, Hyung-Jun
    Cho, Sung-Bae
    INTERNATIONAL JOURNAL OF NEURAL SYSTEMS, 2025,
  • [5] Contrastive Learning of Multivariate Gaussian Distributions of Incremental Classes for Continual Learning
    Moon, Hyung-Jun
    Cho, Sung-Bae
    ARTIFICIAL INTELLIGENCE FOR NEUROSCIENCE AND EMOTIONAL SYSTEMS, PT I, IWINAC 2024, 2024, 14674 : 518 - 527
  • [6] Pedestrian Detection by Exemplar-Guided Contrastive Learning
    Lin, Zebin
    Pei, Wenjie
    Chen, Fanglin
    Zhang, David
    Lu, Guangming
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2023, 32 : 2003 - 2016
  • [7] CCL: Continual Contrastive Learning for LiDAR Place Recognition
    Cui, Jiafeng
    Chen, Xieyuanli
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2023, 8 (08) : 4433 - 4440
  • [8] Continual Nuclei Segmentation via Prototype-Wise Relation Distillation and Contrastive Learning
    Wu, Huisi
    Wang, Zhaoze
    Zhao, Zebin
    Chen, Cheng
    Qin, Jing
    IEEE TRANSACTIONS ON MEDICAL IMAGING, 2023, 42 (12) : 3794 - 3804
  • [9] A Contrastive Continual Learning for the Classification of Remote Sensing Imagery
    Alakooz, Abdulaziz S.
    Ammour, Nassim
    2022 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM (IGARSS 2022), 2022, : 7902 - 7905
  • [10] Learning Deep Representations via Contrastive Learning for Instance Retrieval
    Wu, Tao
    Luo, Tie
    Wunsch, Donald C., II
    2022 IEEE SYMPOSIUM SERIES ON COMPUTATIONAL INTELLIGENCE (SSCI), 2022, : 1501 - 1506