CTL-I: Infrared Few-Shot Learning via Omnidirectional Compatible Class-Incremental

Cited by: 2
Authors
Yang, Biwen [1 ]
Zhang, Ruiheng [1 ]
Liu, Yumeng [2 ]
Liu, Guanyu [1 ]
Cao, Zhe [1 ]
Yang, Zhidong [1 ]
Yu, Heng [1 ]
Xu, Lixin [1 ]
Affiliations
[1] Beijing Inst Technol, 5 Yard, Zhong Guan Cun South St, Beijing, Peoples R China
[2] Chinese Acad Sci, Inst Software, Beijing Key Lab Human Comp Interact, Beijing, Peoples R China
Source
BIG DATA TECHNOLOGIES AND APPLICATIONS, EAI INTERNATIONAL CONFERENCE, BDTA 2023 | 2024 / Vol. 555
Funding
China Postdoctoral Science Foundation; National Natural Science Foundation of China
Keywords
Infrared; Few-shot Learning; Class-incremental Learning
DOI
10.1007/978-3-031-52265-9_1
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Accommodating novel infrared classes in a deep learning model without sacrificing prior knowledge of the base classes is a challenging task, especially when the data available for the novel classes is limited. Existing infrared few-shot learning methods mainly measure similarity between novel and base embedding spaces or transfer novel-class features into the base-class feature space, leaving no room reserved for classes that have not yet arrived. To address this issue, we propose Infrared Omnidirectional Compatibility Training Learning (CTL-I). We build virtual infrared prototypes in the base model to reserve feature space for potential novel classes in advance. By coupling virtual and real data, we gradually update these virtual prototypes into predictions for the incoming classes, yielding a stronger classifier that adapts effectively to novel classes while retaining the general infrared features learned from the base classes. Empirical results demonstrate that our approach outperforms existing few-shot incremental learning methods on various benchmark datasets, even with extremely few instances per class. Our work offers a promising direction for addressing the challenges of few-shot incremental learning on infrared imagery.
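A minimal illustrative sketch of the virtual-prototype idea described in the abstract: extra prototype vectors are pre-allocated alongside the base-class prototypes, and when a few-shot novel class arrives, its support-set prototype is coupled with one of the reserved virtual prototypes. The class name VirtualPrototypeClassifier, the blending weight alpha, the sequential slot assignment, and the cosine-similarity classifier are assumptions for illustration, not details taken from the paper.

# Illustrative sketch only (assumed details, not the paper's code): a
# cosine-similarity classifier that reserves "virtual" prototypes for classes
# that have not arrived yet, then couples each virtual prototype with the
# real few-shot prototype when that class appears.
import torch
import torch.nn.functional as F


class VirtualPrototypeClassifier:
    def __init__(self, feat_dim: int, num_base: int, num_virtual: int):
        # Base-class prototypes would be estimated from abundant base data;
        # virtual prototypes pre-allocate embedding space for future classes.
        self.base_protos = F.normalize(torch.randn(num_base, feat_dim), dim=1)
        self.virtual_protos = F.normalize(torch.randn(num_virtual, feat_dim), dim=1)
        self.num_registered = 0  # how many virtual slots now hold real classes

    def register_novel_class(self, support_feats: torch.Tensor, alpha: float = 0.5):
        # Couple the next free virtual prototype with the mean of the
        # few-shot support features (alpha is an assumed blending weight).
        assert self.num_registered < self.virtual_protos.shape[0], "no virtual slots left"
        real_proto = F.normalize(support_feats.mean(dim=0), dim=0)
        slot = self.num_registered
        coupled = alpha * self.virtual_protos[slot] + (1 - alpha) * real_proto
        self.virtual_protos[slot] = F.normalize(coupled, dim=0)
        self.num_registered += 1

    def predict(self, query_feats: torch.Tensor) -> torch.Tensor:
        # Cosine-similarity classification over base + registered novel classes.
        protos = torch.cat(
            [self.base_protos, self.virtual_protos[: self.num_registered]], dim=0
        )
        logits = F.normalize(query_feats, dim=1) @ protos.t()
        return logits.argmax(dim=1)


# Toy usage: 64-d features, 10 base classes, room reserved for 5 novel classes.
clf = VirtualPrototypeClassifier(feat_dim=64, num_base=10, num_virtual=5)
clf.register_novel_class(torch.randn(5, 64))   # a 5-shot novel class
print(clf.predict(torch.randn(3, 64)))         # class indices in [0, 10]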
Pages: 3-17
Number of pages: 15