Memory Efficient Class-Incremental Learning for Image Classification

Cited by: 47
Authors
Zhao, Hanbin [1 ]
Wang, Hui [1 ]
Fu, Yongjian [1 ]
Wu, Fei [1 ]
Li, Xi [1 ,2 ]
Affiliations
[1] Zhejiang Univ, Coll Comp Sci & Technol, Hangzhou 310027, Peoples R China
[2] Zhejiang Univ, Shanghai Inst Adv Study, Shanghai 201210, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Feature extraction; Knowledge transfer; Data mining; Adaptation models; Training; Noise measurement; Knowledge engineering; Catastrophic forgetting; class-incremental learning (CIL); classification; exemplar; memory efficient;
DOI
10.1109/TNNLS.2021.3072041
CLC Classification Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Under memory-resource-limited constraints, class-incremental learning (CIL) usually suffers from the "catastrophic forgetting" problem when the joint classification model is updated on the arrival of newly added classes. To cope with forgetting, many CIL methods transfer the knowledge of old classes by preserving some exemplar samples in a size-constrained memory buffer. To utilize the memory buffer more efficiently, we propose to keep more auxiliary low-fidelity exemplar samples rather than the original high-fidelity ones. Such a memory-efficient exemplar-preserving scheme makes old-class knowledge transfer more effective. However, the low-fidelity exemplar samples are often distributed in a domain different from that of the original exemplar samples; that is, there is a domain shift. To alleviate this problem, we propose a duplet learning scheme that constructs domain-compatible feature extractors and classifiers, which greatly narrows the above domain gap. As a result, these low-fidelity auxiliary exemplar samples can moderately replace the original exemplar samples at a lower memory cost. In addition, we present a robust classifier adaptation scheme, which further refines the biased classifier (learned with samples containing distillation label knowledge about old classes) with the help of samples carrying pure true class labels. Experimental results demonstrate the effectiveness of this work against state-of-the-art approaches. We will release the code, baselines, and training statistics for all models to facilitate future research.
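The core memory-budget argument of the abstract can be illustrated with a minimal sketch (not the authors' released code): storing low-fidelity, downsampled exemplars instead of full-resolution ones lets the same fixed buffer hold several times as many samples per old class. The image dimensions and budget below are illustrative assumptions (CIFAR-like 32x32x3 uint8 images, a 2x spatial downsampling).

```python
# Hedged sketch: exemplar counts under a fixed memory budget, assuming
# uint8 images so that one image costs h * w * c bytes.

def exemplar_capacity(budget_bytes: int, h: int, w: int, c: int = 3) -> int:
    """Number of h x w x c uint8 images that fit in the memory budget."""
    return budget_bytes // (h * w * c)

# Example budget: a buffer sized to hold 2000 original 32x32 RGB images.
BUDGET = 2000 * 32 * 32 * 3

full_fidelity = exemplar_capacity(BUDGET, 32, 32)  # original exemplars
low_fidelity = exemplar_capacity(BUDGET, 16, 16)   # 2x-downsampled exemplars

print(full_fidelity, low_fidelity)  # prints "2000 8000"
```

Halving each spatial dimension quarters the per-image cost, so the same buffer holds 4x as many low-fidelity exemplars; the paper's duplet learning scheme is what makes these cheaper exemplars usable despite their domain shift from the original data.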
Pages: 5966-5977 (12 pages)