Few-shot class incremental learning via prompt transfer and knowledge distillation

Times Cited: 1
Authors
Akmel, Feidu [1 ]
Meng, Fanman [1 ]
Liu, Mingyu
Zhang, Runtong [1 ]
Teka, Asebe [1 ]
Lemuye, Elias [2 ]
Affiliations
[1] Univ Elect Sci & Technol China, Sch Informat & Commun Engn, Chengdu, Peoples R China
[2] Univ Elect Sci & Technol China, Sch Comp Sci, Chengdu, Peoples R China
Keywords
Knowledge distillation; Prompting; Few-shot learning; Incremental learning
DOI
10.1016/j.imavis.2024.105251
CLC number
TP18 [Artificial intelligence theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Few-shot incremental learning is the ability of a model to learn incrementally from very limited data while retaining knowledge of previously seen classes. Few-shot learning models are prone to overfitting on the scarce data, while incremental learning models suffer from catastrophic forgetting. To address both problems, we propose a distillation algorithm coupled with prompting that transfers distilled knowledge from a source prompt to a target prompt. Furthermore, we employ a feature embedding module that measures the semantic similarity between input labels and semantic vectors, giving the learner additional guidance and thereby mitigating catastrophic forgetting and overfitting. As our third contribution, we introduce an attention-based knowledge distillation method that learns relative similarities between features by creating effective links between the teacher and the student, allowing the distillation intensity of every teacher-student pair to be regulated. Extensive experiments on diverse datasets, including miniImageNet, CIFAR100, and CUB200, demonstrate that our method achieves state-of-the-art performance.
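The attention-based distillation described in the abstract can be illustrated with a small sketch. The snippet below is not the authors' released code; it is a minimal Python/PyTorch illustration, under the assumption that pairwise student-teacher feature similarities are converted into softmax attention weights (with a hypothetical temperature tau) that scale the distillation intensity of each teacher-student pair.

# Minimal sketch (illustrative only) of attention-weighted feature distillation:
# pairwise similarities between student and teacher features become attention
# weights that regulate how strongly each pair is distilled. Shapes, the
# temperature tau, and the distance measure are assumptions, not the paper's spec.
import torch
import torch.nn.functional as F

def attention_weighted_distillation(student_feats, teacher_feats, tau=1.0):
    """student_feats, teacher_feats: (N, D) batches of backbone features."""
    s = F.normalize(student_feats, dim=1)
    t = F.normalize(teacher_feats, dim=1)
    # Relative similarity between every student-teacher pair (N x N).
    sim = s @ t.t()
    # Attention weights that set the distillation intensity of each pair.
    attn = F.softmax(sim / tau, dim=1)
    # Euclidean distance between every student-teacher pair (N x N).
    dist = torch.cdist(s, t, p=2)
    # Attention-weighted distillation loss (scalar).
    return (attn * dist).sum(dim=1).mean()

# Example usage with random features standing in for backbone outputs.
student = torch.randn(8, 64)
teacher = torch.randn(8, 64)
loss = attention_weighted_distillation(student, teacher)

In this sketch, pairs with higher feature similarity receive larger attention weights, so distillation strength varies per pair rather than being uniform; the exact weighting scheme used in the paper may differ.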
Pages: 9