Flexible few-shot class-incremental learning with prototype container

Cited by: 2
Authors
Xu, Xinlei [1 ,2 ]
Wang, Zhe [1 ,2 ]
Fu, Zhiling [1 ,2 ]
Guo, Wei [1 ,2 ]
Chi, Ziqiu [1 ,2 ]
Li, Dongdong [1 ,2 ]
Affiliations
[1] East China Univ Sci & Technol, Key Lab Smart Mfg Energy Chem Proc, Minist Educ, Shanghai 200237, Peoples R China
[2] East China Univ Sci & Technol, Dept Comp Sci & Engn, Shanghai 200237, Peoples R China
Keywords
Few-shot class-incremental learning; Few-shot learning; Incremental learning; INFORMATION;
DOI: 10.1007/s00521-023-08272-y
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
In few-shot class-incremental learning, new class samples are used to learn the characteristics of new classes, while old class exemplars are used to avoid forgetting old knowledge. The limited number of new class samples makes overfitting likely during incremental training. Moreover, storing many old exemplars consumes a large amount of storage space. To solve these difficulties, in this paper we propose a novel flexible few-shot class-incremental framework that makes the incremental process efficient and convenient. We enhance the expressive ability of extracted features through multistage pre-training. Then, we set up a prototype container that stores each class prototype to retain old knowledge. When new classes arrive, we calculate the new class prototypes and update the prototype container. Finally, we obtain the prediction result through similarity weighting. The entire framework only needs to train the base class classifier and requires no further training during the incremental process. This avoids overfitting on novel classes and saves the time that further training would require. Besides, storing prototypes saves more storage space than storing the original image data. Overall, the entire framework has the advantage of flexibility. We conduct extensive experiments on three standard few-shot class-incremental datasets and achieve state-of-the-art results. In particular, to verify the flexibility of the framework, we additionally discuss special federated few-shot class-incremental scenarios. No further training and less storage consumption open the possibility of applications in more complex scenarios.
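The abstract describes the core mechanism: one prototype (a mean feature vector) per class is kept in a container, the container is updated without any retraining when new classes arrive, and prediction weighs the similarity between a query feature and every stored prototype. Below is a minimal sketch of that idea, assuming feature vectors come from a frozen pre-trained backbone; the class and method names (`PrototypeContainer`, `update`, `predict`) are hypothetical illustrations, not the authors' code.

```python
import numpy as np

class PrototypeContainer:
    """Hypothetical sketch of a prototype container: stores one
    mean-feature prototype per class. Assumes an upstream (frozen)
    backbone has already mapped images to feature vectors."""

    def __init__(self):
        self.prototypes = {}  # class_id -> prototype vector

    def update(self, features, labels):
        # Prototype = mean of the (few) support features of a class.
        # Incremental sessions only add entries; nothing is retrained.
        for c in np.unique(labels):
            self.prototypes[c] = features[labels == c].mean(axis=0)

    def predict(self, query):
        # Similarity weighting via cosine similarity between the query
        # and every stored prototype; the most similar class wins.
        classes = list(self.prototypes)
        protos = np.stack([self.prototypes[c] for c in classes])
        protos = protos / np.linalg.norm(protos, axis=1, keepdims=True)
        q = query / np.linalg.norm(query)
        return classes[int(np.argmax(protos @ q))]

# Base session with 2 classes, then an incremental session adds a
# third class from 5 shots -- only a container update, no training.
rng = np.random.default_rng(0)
container = PrototypeContainer()
base_feats = np.vstack([rng.normal(0, 0.1, (5, 8)),
                        rng.normal(3, 0.1, (5, 8))])
container.update(base_feats, np.array([0] * 5 + [1] * 5))
container.update(rng.normal(-3, 0.1, (5, 8)), np.array([2] * 5))
print(container.predict(np.full(8, 3.0)))   # features near class 1
print(container.predict(np.full(8, -3.0)))  # features near class 2
```

Because prediction reduces to a nearest-prototype lookup in feature space, storage cost is one vector per class rather than a buffer of raw exemplar images, which is the storage saving the abstract claims.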
Pages: 10875-10889 (15 pages)