Autonomous perception and adaptive standardization for few-shot learning

Cited by: 4
Authors
Zhang, Yourun [1 ]
Gong, Maoguo [1 ]
Li, Jianzhao [1 ]
Feng, Kaiyuan [1 ]
Zhang, Mingyang [1 ]
Affiliations
[1] Xidian Univ, Key Lab Collaborat Intelligence Syst, Minist Educ, 2 South TaiBai Rd, Xian 710071, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Few-shot learning; Image classification; Deep learning; Feature extraction; RAT MODEL; NETWORK; ALIGNMENT;
DOI
10.1016/j.knosys.2023.110746
Chinese Library Classification (CLC)
TP18 [Theory of artificial intelligence];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Identifying unseen classes from only a few labeled reference samples, a task known as few-shot learning, is challenging. In general, a knowledge-rich model is more robust than a knowledge-poor one when facing novel situations, and an intuitive way to enrich knowledge is to collect additional training data; however, this conflicts with the principle of few-shot learning, which aims to reduce reliance on big data. Improving the utilization of existing data is therefore a more attractive option. In this paper, we propose a batch perception distillation approach that improves the utilization of existing data by guiding the classification of individual samples with information intermixed across a batch. Beyond data utilization, obtaining robust feature representations is also a concern. Specifically, the widely adopted metric-based few-shot classification approach classifies unseen testing classes by comparing the extracted features of novel samples, which requires that these features accurately capture the class-related clues in the input images. We therefore propose a salience perception attention that enables the model to focus more easily on key clues in images, reducing the interference of irrelevant factors during classification. To overcome the distribution gap between the training classes and the unseen testing classes, we further propose a weighted centering post-processing that standardizes the testing data according to the similarity between the training and testing classes. Combining the three proposed components, our method achieves superior performance on four widely used few-shot image classification datasets. (C) 2023 Elsevier B.V. All rights reserved.
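The record does not give the paper's exact formulations, so the following NumPy sketch only illustrates the general ideas the abstract names: nearest-prototype (metric-based) classification of query features, and a similarity-weighted centering step that standardizes test features against base-class statistics. The function names, the cosine-similarity softmax weighting, and the temperature parameter are assumptions made for illustration, not the authors' method.

# Minimal sketch (NumPy) of metric-based few-shot classification with a
# similarity-weighted centering step; illustrative only, not the paper's
# exact weighted centering post-processing.
import numpy as np

def weighted_centering(features, base_class_means, temperature=1.0):
    """Subtract a similarity-weighted mixture of base-class means.

    features:         (n, d) extracted features of novel-class samples.
    base_class_means: (c, d) mean feature of each training (base) class.
    Weights are a softmax over cosine similarities, so base classes that
    resemble the test data contribute more to the subtracted center.
    """
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    m = base_class_means / np.linalg.norm(base_class_means, axis=1, keepdims=True)
    sims = f @ m.T                                  # (n, c) cosine similarities
    weights = np.exp(sims / temperature)
    weights /= weights.sum(axis=1, keepdims=True)   # softmax over base classes
    centers = weights @ base_class_means            # (n, d) per-sample centers
    return features - centers

def prototype_classify(support, support_labels, query, n_way):
    """Nearest-prototype classification with cosine similarity."""
    prototypes = np.stack([support[support_labels == k].mean(axis=0)
                           for k in range(n_way)])
    q = query / np.linalg.norm(query, axis=1, keepdims=True)
    p = prototypes / np.linalg.norm(prototypes, axis=1, keepdims=True)
    return (q @ p.T).argmax(axis=1)                 # predicted class indices

# Example usage with random features (5-way, 1-shot, 75 queries, 64-dim):
rng = np.random.default_rng(0)
base_means = rng.normal(size=(64, 64))              # 64 base classes
support = rng.normal(size=(5, 64))
query = weighted_centering(rng.normal(size=(75, 64)), base_means)
preds = prototype_classify(support, np.arange(5), query, n_way=5)

In practice the same centering statistics would presumably be applied to the support features as well, so that support and query embeddings remain in a common space before the prototype comparison.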
Pages: 14