FRMM: Feature Reprojection for Exemplar-Free Class-Incremental Learning

Cited by: 0
Authors
Wang, Hao [1 ]
Chen, Jing [1 ,2 ]
Institutions
[1] Jiangnan Univ, Sch Artificial Intelligence & Comp Sci, Wuxi 214122, Jiangsu, Peoples R China
[2] Jiangnan Univ, Jiangsu Prov Engn Lab Pattern Recognit & Comp Int, Wuxi 214122, Jiangsu, Peoples R China
Source
ADVANCED INTELLIGENT COMPUTING TECHNOLOGY AND APPLICATIONS, PT III, ICIC 2024 | 2024, Vol. 14864
Keywords
incremental learning; continual learning; catastrophic forgetting; ensemble learning;
DOI
10.1007/978-981-97-5588-2_22
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Class-incremental learning involves handling a sequence of classification tasks arriving chronologically, with each task containing different classes. The model must gradually expand its knowledge by learning these tasks successively to acquire the ability to classify all encountered classes. Unlike exemplar-based class-incremental learning (EBCIL), which allows storing some old samples, exemplar-free class-incremental learning (EFCIL) faces a more severe forgetting problem because access to old data is completely prohibited. Some previous methods freeze the feature extractor after the initial stage to enhance the model's stability on the base classes, but this also leads to a larger distribution discrepancy between the feature vectors of base and incremental classes. We propose inserting a reprojection layer between the feature extractor and the classifier to project feature vectors onto a unified distribution, facilitating subsequent classification. Feature reprojection improves the performance of both the linear and the nearest class mean (NCM) classifier, but their outputs exhibit different biases. We treat them as different experts and ensemble their predictions to combine their strengths. Additionally, we introduce a momentum update to alleviate the linear classifier's inherent forgetting. Across three public datasets of varying scales, our method, named FRMM, outperforms others in most cases while requiring only 1/256 of the storage required by the state-of-the-art method that saves a covariance matrix for each class. Code is available at https://github.com/BlacknoSheep/CIL.
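The pipeline the abstract describes — a reprojection layer in front of two classifier heads (linear and NCM), an ensemble of their predictions, and a momentum update on the linear weights — can be sketched in pure Python. This is an illustrative sketch under stated assumptions, not the authors' implementation: the function names, the equal ensemble weight `alpha=0.5`, and the momentum coefficient `m=0.9` are placeholders, not values from the paper.

```python
import math

def reproject(W_rep, x):
    """Reprojection layer: a linear map that projects a frozen-backbone
    feature x onto a unified distribution shared by base and new classes."""
    return [sum(w * xi for w, xi in zip(row, x)) for row in W_rep]

def softmax(z):
    m = max(z)
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

def linear_logits(W_cls, f):
    """Linear classifier head: one weight row per class."""
    return [sum(w * fi for w, fi in zip(row, f)) for row in W_cls]

def ncm_logits(class_means, f):
    """NCM head: score each class by negative distance to its mean."""
    return [-math.dist(mean, f) for mean in class_means]

def ensemble(p_lin, p_ncm, alpha=0.5):
    """Treat the two heads as experts and mix their probabilities."""
    return [alpha * a + (1 - alpha) * b for a, b in zip(p_lin, p_ncm)]

def momentum_update(W_old, W_new, m=0.9):
    """EMA of linear-classifier weights to slow drift on old classes."""
    return [[m * o + (1 - m) * n for o, n in zip(ro, rn)]
            for ro, rn in zip(W_old, W_new)]

# Toy 2-D example: the feature sits exactly on the mean of class 0.
W_rep = [[1.0, 0.0], [0.0, 1.0]]          # identity reprojection
f = reproject(W_rep, [2.0, 3.0])
p_lin = softmax(linear_logits([[1.0, 0.0], [0.0, 1.0]], f))
p_ncm = softmax(ncm_logits([[2.0, 3.0], [0.0, 0.0]], f))
p = ensemble(p_lin, p_ncm)
```

In this toy case the two heads disagree (the linear head slightly favors class 1, while the NCM head strongly favors class 0, whose mean the feature lies on), and the ensemble resolves the conflict in favor of class 0 — the kind of complementary bias the abstract motivates.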
Pages: 251-263 (13 pages)