FRMM: Feature Reprojection for Exemplar-Free Class-Incremental Learning

Cited by: 0
Authors
Wang, Hao [1 ]
Chen, Jing [1 ,2 ]
Affiliations
[1] Jiangnan Univ, Sch Artificial Intelligence & Comp Sci, Wuxi 214122, Jiangsu, Peoples R China
[2] Jiangnan Univ, Jiangsu Prov Engn Lab Pattern Recognit & Comp Int, Wuxi 214122, Jiangsu, Peoples R China
Source
ADVANCED INTELLIGENT COMPUTING TECHNOLOGY AND APPLICATIONS, PT III, ICIC 2024 | 2024 / Vol. 14864
Keywords
incremental learning; continual learning; catastrophic forgetting; ensemble learning;
DOI
10.1007/978-981-97-5588-2_22
CLC Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Class-incremental learning involves handling a sequence of classification tasks that arrive chronologically, each containing different classes. The model must gradually expand its knowledge by learning these tasks successively, acquiring the ability to classify all classes encountered so far. Unlike exemplar-based class-incremental learning (EBCIL), which allows storing some old samples, exemplar-free class-incremental learning (EFCIL) faces a more severe forgetting problem because access to old data is completely prohibited. Some previous methods freeze the feature extractor after the initial stage to enhance the model's stability on the base classes, but this also leads to a larger distribution discrepancy between the feature vectors of base and incremental classes. We propose inserting a reprojection layer between the feature extractor and the classifier to project feature vectors onto a unified distribution, facilitating subsequent classification. Feature reprojection improves the performance of both the linear and the nearest class mean (NCM) classifier, but their outputs exhibit different biases. We treat them as different experts and ensemble their predictions to combine their strengths. Additionally, we introduce a momentum update to alleviate the linear classifier's inherent forgetting. Across three public datasets of varying scales, our method, named FRMM, outperforms others in most cases while requiring only 1/256 of the storage used by the state-of-the-art method, which saves a covariance matrix for each class. Code is available at https://github.com/BlacknoSheep/CIL.
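The pipeline the abstract describes (frozen extractor → reprojection layer → linear and NCM experts, with a momentum update on the linear weights and an ensemble of the two outputs) can be sketched roughly as follows. All dimensions, the identity initialization of the reprojection matrix, the momentum coefficient, and the equal-weight ensemble are illustrative assumptions, not details taken from the paper; see the authors' repository for the actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a frozen feature extractor: 8-d features, 3 classes,
# 10 samples per class (all numbers are assumptions for illustration).
d, n_classes = 8, 3
feats = rng.normal(size=(30, d))
labels = np.repeat(np.arange(n_classes), 10)

# Reprojection layer: a linear map inserted between the frozen extractor
# and the classifiers (identity-initialized here; trained in practice).
W_p = np.eye(d)
z = feats @ W_p

# Expert 1 — NCM classifier: score by negative distance to class means
# computed in the reprojected space.
means = np.stack([z[labels == c].mean(axis=0) for c in range(n_classes)])
ncm_scores = -((z[:, None, :] - means[None, :, :]) ** 2).sum(axis=-1)

# Expert 2 — linear classifier with a momentum update of its weights:
# W_lin = m * W_old + (1 - m) * W_task, damping drift across tasks.
m = 0.9
W_old = rng.normal(size=(d, n_classes))   # weights before the current task
W_task = rng.normal(size=(d, n_classes))  # weights fitted on the current task
W_lin = m * W_old + (1 - m) * W_task
lin_scores = z @ W_lin

def softmax(x):
    e = np.exp(x - x.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

# Ensemble: average the two experts' probability outputs.
probs = 0.5 * (softmax(ncm_scores) + softmax(lin_scores))
pred = probs.argmax(axis=1)
```

The momentum update keeps the linear head close to its pre-task weights, which is one plausible reading of how the paper alleviates that classifier's forgetting; the exact update rule and ensemble weighting used by FRMM may differ.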
Pages: 251-263
Page count: 13
Related Papers
50 records in total
  • [21] Jodelet, Quentin; Liu, Xin; Phua, Yin Jun; Murata, Tsuyoshi. Future-proofing class-incremental learning. MACHINE VISION AND APPLICATIONS, 2025, 36 (01)
  • [22] Zhang, Lijuan; Yang, Xiaokang; Zhang, Kai; Li, Yong; Li, Fu; Li, Jun; Li, Dongming. Class-Incremental Learning Based on Anomaly Detection. IEEE ACCESS, 2023, 11: 69423-69438
  • [23] Nokhwal, Sahil; Kumar, Nirman. DSS: A Diverse Sample Selection Method to Preserve Knowledge in Class-Incremental Learning. 2023 10TH INTERNATIONAL CONFERENCE ON SOFT COMPUTING & MACHINE INTELLIGENCE, ISCMI, 2023: 178-182
  • [24] Boschini, Matteo; Bonicelli, Lorenzo; Buzzega, Pietro; Porrello, Angelo; Calderara, Simone. Class-Incremental Continual Learning Into the eXtended DER-Verse. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, 45 (05): 5497-5512
  • [25] de Carvalho, Marcus; Pratama, Mahardhika; Zhang, Jie; Sun, Yajuan. Class-Incremental Learning via Knowledge Amalgamation. MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2022, PT III, 2023, 13715: 36-50
  • [26] Nguyen, Toan; Kieu, Duc; Duong, Bao; Kieu, Tung; Do, Kien; Nguyen, Thin; Le, Bac. Class-incremental learning with causal relational replay. EXPERT SYSTEMS WITH APPLICATIONS, 2024, 250
  • [27] Rahman, Fahrurrozi; Rosales Sanabria, Andrea; Ye, Juan. Opportunistic Dynamic Architecture for Class-Incremental Learning. IEEE ACCESS, 2025, 13: 59146-59156
  • [28] Sun, Xiaopeng; Yu, Lu; Xu, Changsheng. Class-Incremental Learning Method Based on Feature Space Augmented Replay and Bias Correction. Moshi Shibie yu Rengong Zhineng/Pattern Recognition and Artificial Intelligence, 2024, 37 (08): 729-740
  • [29] Ye, Xin; Zhu, Qiuyu. Class-Incremental Learning Based on Feature Extraction of CNN with Optimized Softmax and One-Class Classifiers. IEEE ACCESS, 2019, 7: 42024-42031
  • [30] Gomez-Villa, Alex; Goswami, Dipam; Wang, Kai; Bagdanov, Andrew D.; Twardowski, Bartlomiej; van de Weijer, Joost. Exemplar-Free Continual Representation Learning via Learnable Drift Compensation. COMPUTER VISION-ECCV 2024, PT VII, 2025, 15065: 473-490