Overcomplete-to-sparse representation learning for few-shot class-incremental learning

Cited: 0
Authors
Fu, Mengying [1]
Liu, Binghao [1]
Ma, Tianren [1]
Ye, Qixiang [1]
Affiliations
[1] School of Electronic, Electrical and Communication Engineering, University of Chinese Academy of Sciences, Beijing 100049, People's Republic of China
Funding
National Natural Science Foundation of China
Keywords
Few-shot; Class-incremental learning; Overcomplete-to-sparse; Representation learning;
DOI
10.1007/s00530-024-01294-z
Chinese Library Classification (CLC)
TP [Automation technology; computer technology]
Discipline classification code
0812
Abstract
Few-shot class-incremental learning (FSCIL) aims to continually learn new semantics given only a few training samples of new classes. As the training examples are too few to construct good representations upon, FSCIL is required to generalize learned semantics from old classes to new ones, as well as to reduce the representation aliasing between them (i.e., the 'forgetting' of old classes). This motivates us to develop overcomplete-to-sparse representation learning (O2SRL). It solves the 'new class generalization' and 'old class forgetting' problems systematically by regularizing both feature completeness and feature sparsity. Specifically, O2SRL consists of a spatial excitation module (SEM) and a channel purification module (CPM). SEM drives the model to learn overcomplete and generic features, which not only represent all classes well but also benefit generalization to new classes. CPM regularizes the sparsity and uniqueness of features, reducing semantic aliasing between classes and alleviating the forgetting of old classes. The two modules facilitate each other to configure unique and robust representations for both old and new classes. Experiments show that O2SRL improves the state of the art of FSCIL by significant margins on public datasets including CUB200, CIFAR100, and mini-ImageNet. O2SRL's effectiveness is also validated under the general few-shot learning setting.
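The abstract names SEM and CPM but does not give their designs, so the following is only a minimal, speculative PyTorch sketch of what such modules could look like: it assumes SEM is a sigmoid-gated spatial excitation that amplifies responses everywhere, and CPM is a channel gate with an L1 sparsity penalty. The class names, layer choices, and loss term are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn as nn


class SpatialExcitationModule(nn.Module):
    """Hypothetical SEM: a lightweight spatial gate that amplifies responses
    at every location, encouraging an overcomplete, class-generic
    representation. Layer choices are assumptions, not the paper's design."""

    def __init__(self, channels: int):
        super().__init__()
        self.excite = nn.Sequential(
            nn.Conv2d(channels, 1, kernel_size=1),  # per-location score
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B, C, H, W); amplify rather than suppress spatial responses
        gate = self.excite(x)          # (B, 1, H, W), values in [0, 1]
        return x * (1.0 + gate)


class ChannelPurificationModule(nn.Module):
    """Hypothetical CPM: a squeeze-and-excitation style channel gate whose
    weights carry an L1 penalty, so each class relies on a small, distinct
    subset of channels (again, an assumption for illustration)."""

    def __init__(self, channels: int):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.gate = nn.Sequential(
            nn.Linear(channels, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor):
        w = self.gate(self.pool(x).flatten(1))   # (B, C) channel weights
        sparsity_loss = w.abs().mean()           # L1 sparsity regularizer
        return x * w.unsqueeze(-1).unsqueeze(-1), sparsity_loss


if __name__ == "__main__":
    feats = torch.randn(4, 64, 7, 7)             # dummy backbone features
    sem = SpatialExcitationModule(64)
    cpm = ChannelPurificationModule(64)
    purified, l1 = cpm(sem(feats))
    print(purified.shape, l1.item())             # torch.Size([4, 64, 7, 7])
```

In a training loop one would presumably add the sparsity term to the classification loss with a small weight, e.g. loss = ce_loss + lambda_sparse * l1; the actual losses and weights used by O2SRL are not specified in this record.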
Pages: 11
Related papers
50 records in total (items [31]-[40] shown)
  • [31] Few-shot class incremental learning via robust transformer approach. Paeedeh, Naeem; Pratama, Mahardhika; Wibirama, Sunu; Mayer, Wolfgang; Cao, Zehong; Kowalczyk, Ryszard. INFORMATION SCIENCES, 2024, 675.
  • [32] Sparse personalized federated class-incremental learning. Liu, Youchao; Huang, Dingjiang. INFORMATION SCIENCES, 2025, 706.
  • [33] Few-Shot Class Incremental Learning for Packaging Defects with Double Class Feature Interaction. Zhu, Liming; Zhang, Bo; Zhao, Dan; Hu, Hongtao. PROCEEDINGS OF THE 36TH CHINESE CONTROL AND DECISION CONFERENCE, CCDC 2024, 2024: 5113-5118.
  • [34] Class-Incremental Generalized Zero-Shot Learning. Sun, Zhenfeng; Feng, Rui; Fu, Yanwei. MULTIMEDIA TOOLS AND APPLICATIONS, 2023, 82: 38233-38247.
  • [35] Class-Incremental Generalized Zero-Shot Learning. Sun, Zhenfeng; Feng, Rui; Fu, Yanwei. MULTIMEDIA TOOLS AND APPLICATIONS, 2023, 82 (24): 38233-38247.
  • [36] Coarse-To-Fine Incremental Few-Shot Learning. Xiang, Xiang; Tan, Yuwen; Wan, Qian; Ma, Jing; Yuille, Alan; Hager, Gregory D. COMPUTER VISION, ECCV 2022, PT XXXI, 2022, 13691: 205-222.
  • [37] Few-shot Class-incremental Audio Classification Using Adaptively-refined Prototypes. Xie, Wei; Li, Yanxiong; He, Qianhua; Cao, Wenchang; Virtanen, Tuomas. INTERSPEECH 2023, 2023: 301-305.
  • [38] Identification of chrysanthemum using hyperspectral imaging based on few-shot class incremental learning. Cai, Zeyi; He, Mengyu; Li, Cheng; Qi, Hengnian; Bai, Ruibin; Yang, Jian; Zhang, Chu. COMPUTERS AND ELECTRONICS IN AGRICULTURE, 2023, 215.
  • [39] Decoupled Two-Phase Framework for Class-Incremental Few-Shot Named Entity Recognition. Chen, Yifan; Huang, Zhen; Hu, Minghao; Li, Dongsheng; Wang, Changjian; Liu, Feng; Lu, Xicheng. TSINGHUA SCIENCE AND TECHNOLOGY, 2023, 28 (05): 976-987.
  • [40] Class-Incremental Learning: A Survey. Zhou, Da-Wei; Wang, Qi-Wei; Qi, Zhi-Hong; Ye, Han-Jia; Zhan, De-Chuan; Liu, Ziwei. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2024, 46 (12): 9851-9873.