FRMM: Feature Reprojection for Exemplar-Free Class-Incremental Learning

Times cited: 0
Authors
Wang, Hao [1 ]
Chen, Jing [1 ,2 ]
Affiliations
[1] Jiangnan Univ, Sch Artificial Intelligence & Comp Sci, Wuxi 214122, Jiangsu, Peoples R China
[2] Jiangnan Univ, Jiangsu Prov Engn Lab Pattern Recognit & Comp Int, Wuxi 214122, Jiangsu, Peoples R China
Source
ADVANCED INTELLIGENT COMPUTING TECHNOLOGY AND APPLICATIONS, PT III, ICIC 2024 | 2024, Vol. 14864
Keywords
incremental learning; continual learning; catastrophic forgetting; ensemble learning;
DOI
10.1007/978-981-97-5588-2_22
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104; 0812; 0835; 1405;
Abstract
Class-incremental learning involves handling a sequence of classification tasks arriving chronologically, with each task containing different classes. The model must gradually expand its knowledge by learning these tasks successively to acquire the ability to classify all encountered classes. Unlike exemplar-based class-incremental learning (EBCIL), which allows storing some old samples, exemplar-free class-incremental learning (EFCIL) faces a more severe forgetting problem because access to old data is completely prohibited. Some previous methods freeze the feature extractor after the initial stage to enhance the model's stability on the base classes, but this also leads to a larger distribution discrepancy between the feature vectors of base and incremental classes. We propose to insert a reprojection layer between the feature extractor and the classifier to project feature vectors onto a unified distribution, facilitating subsequent classification. Feature reprojection improves the performance of both the linear and the nearest class mean (NCM) classifiers, but their outputs exhibit different biases. We treat them as different experts and ensemble their predictions to combine their strengths. Additionally, we introduce a momentum update to alleviate the linear classifier's inherent forgetting. Across three public datasets of varying scales, our method, named FRMM, outperforms others in most cases while requiring only 1/256 of the storage used by the state-of-the-art method, which saves a covariance matrix for each class. Code is available at https://github.com/BlacknoSheep/CIL.
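The pipeline the abstract describes (frozen extractor → reprojection layer → linear and NCM experts ensembled, with a momentum-updated linear head) can be sketched in a few lines of NumPy. Everything below is an illustrative assumption — the dimensions, the equal 0.5/0.5 ensemble weighting, and the momentum coefficient `m` are placeholders, not the authors' released implementation (see the linked repository for that):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 8-D features from a frozen extractor, 4 classes, 20 samples.
feat_dim, num_classes, n = 8, 4, 20
features = rng.normal(size=(n, feat_dim))      # frozen-extractor outputs
labels = np.arange(n) % num_classes            # 5 samples per class

# Reprojection layer: a linear map projecting features onto a shared
# distribution (randomly initialized here; learned in the real method).
W_p = rng.normal(scale=0.1, size=(feat_dim, feat_dim))
projected = features @ W_p

# Linear head with a momentum update: blend the old weights with the
# weights obtained after training on a new task.
m = 0.9                                                        # assumed coefficient
W_old = rng.normal(scale=0.1, size=(feat_dim, num_classes))
W_new = rng.normal(scale=0.1, size=(feat_dim, num_classes))    # post-task weights
W_linear = m * W_old + (1.0 - m) * W_new

# NCM expert: per-class means of the projected features.
class_means = np.stack([projected[labels == c].mean(axis=0)
                        for c in range(num_classes)])

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Expert 1: linear logits. Expert 2: negative distances to class means.
linear_logits = projected @ W_linear
ncm_logits = -np.linalg.norm(projected[:, None, :] - class_means[None], axis=-1)

# Ensemble the two experts by averaging their softmax probabilities.
probs = 0.5 * (softmax(linear_logits) + softmax(ncm_logits))
preds = probs.argmax(axis=1)
print(preds.shape)  # → (20,)
```

The NCM expert needs only the class means (one vector per class), which is the storage contrast the abstract draws against methods that keep a full covariance matrix per class.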
Pages: 251-263 (13 pages)