Few Shot Class Incremental Learning via Efficient Prototype Replay and Calibration

Times Cited: 7
Authors
Zhang, Wei [1 ]
Gu, Xiaodong [1 ]
Affiliations
[1] Fudan Univ, Dept Elect Engn, Shanghai 200438, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
few-shot learning; incremental learning; meta-learning; feature replay; prototype calibration
DOI
10.3390/e25050776
CLC Number
O4 [Physics]
Discipline Code
0702
Abstract
Few-shot class-incremental learning (FSCIL) is an extremely challenging but valuable problem in real-world applications. When faced with a novel few-shot task at each incremental stage, a model must cope with both catastrophic forgetting of old knowledge and overfitting to new categories with limited training data. In this paper, we propose an efficient prototype replay and calibration (EPRC) method with three stages to improve classification performance. We first perform effective pre-training with rotation and mix-up augmentations to obtain a strong backbone. A series of pseudo few-shot tasks is then sampled for meta-training, which enhances the generalization ability of both the feature extractor and the projection layer and thereby mitigates the overfitting problem of few-shot learning. Furthermore, an even nonlinear transformation function is incorporated into the similarity computation to implicitly calibrate the generated prototypes of different categories and alleviate the correlations among them. Finally, in the incremental-training stage we replay the stored prototypes to relieve catastrophic forgetting and rectify the prototypes to be more discriminative via an explicit regularization term in the loss function. Experimental results on CIFAR-100 and miniImageNet demonstrate that EPRC significantly boosts classification performance compared with existing mainstream FSCIL methods.
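To make the three stages described in the abstract concrete, the Python sketch below illustrates one plausible reading of the pipeline: mean-feature prototypes, cosine similarities passed through an even nonlinear transform, and a replay-plus-rectification loss. The specific choices here (the squared-similarity transform, the scale factor, the pairwise-similarity regularizer, and all function names) are assumptions for illustration only; the abstract does not give the authors' exact formulation.

import torch
import torch.nn.functional as F


def class_prototypes(features, labels, num_classes):
    # Average the embedded support features of each class into one prototype.
    protos = torch.zeros(num_classes, features.size(1), device=features.device)
    for c in range(num_classes):
        protos[c] = features[labels == c].mean(dim=0)
    return protos


def calibrated_logits(queries, protos, scale=10.0):
    # Cosine similarity between query embeddings and prototypes, passed through
    # an even nonlinear map f(s) = s^2 (so f(-s) = f(s)). The paper's actual even
    # transform is not specified in the abstract; this is a placeholder.
    q = F.normalize(queries, dim=1)
    p = F.normalize(protos, dim=1)
    sims = q @ p.t()
    return scale * sims ** 2


def replay_and_rectify_loss(new_feats, new_labels, stored_protos, stored_labels,
                            all_protos, scale=10.0, reg_weight=0.1):
    # Classify new-session samples against all class prototypes.
    loss = F.cross_entropy(calibrated_logits(new_feats, all_protos, scale), new_labels)
    # Replay: treat the stored prototypes of old classes as pseudo-features so the
    # old classes keep contributing a classification loss (relieves forgetting).
    loss = loss + F.cross_entropy(calibrated_logits(stored_protos, all_protos, scale),
                                  stored_labels)
    # Explicit rectification: penalize pairwise similarity between prototypes of
    # different classes so they remain discriminative.
    p = F.normalize(all_protos, dim=1)
    off_diag = p @ p.t() - torch.eye(all_protos.size(0), device=all_protos.device)
    return loss + reg_weight * off_diag.clamp(min=0).mean()


# Example usage with random tensors standing in for backbone features
# (5-way 5-shot support set, 10 query samples):
feats = torch.randn(25, 64)
labels = torch.arange(5).repeat_interleave(5)
protos = class_prototypes(feats, labels, num_classes=5)
loss = replay_and_rectify_loss(torch.randn(10, 64), torch.randint(0, 5, (10,)),
                               protos, torch.arange(5), protos)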
Pages: 17