Active Instance Selection for Few-Shot Classification

Citations: 5
Authors
Shin, Junsup [1 ]
Kang, Youngwook [1 ]
Jung, Seungjin [1 ]
Choi, Jongwon [1 ,2 ]
Affiliations
[1] Chung Ang Univ, Dept Adv Imaging, GSAIM, Seoul, South Korea
[2] Chung Ang Univ, Dept Artificial Intelligence, Seoul 06973, South Korea
Keywords
Few-shot learning; active learning; classification; distance
DOI
10.1109/ACCESS.2022.3231365
Chinese Library Classification (CLC)
TP [Automation & Computer Technology]
Discipline Code
0812
Abstract
Few-shot learning aims to develop well-trained models using only a few annotated samples. However, the performance of few-shot learning deteriorates when inappropriate support samples are selected. This occurs because the model depends heavily on only a few support samples, which prevents stable performance gains even when additional training samples are provided. To overcome this issue, we introduce an active instance selection mechanism in which effective support instances are chosen from the unlabeled pool to improve few-shot classification performance while maintaining the same annotation cost. Our mechanism comprises four processing steps: feature pre-processing, feature distribution fitting, instance scoring, and instance selection. First, feature pre-processing normalizes the scattered features so that they stably form a well-defined distribution. Then, feature distribution fitting finds the distributional relationship between the support images and the unlabeled query samples. Finally, in instance scoring and instance selection, we select instances based on instance-wise scores estimated from the fitted distributions. To verify the validity of the proposed mechanism, we employ various algorithms for feature distribution fitting and instance scoring. We perform several experiments by integrating the proposed mechanism into a variety of few-shot classification methods, and we further validate the effectiveness of the mechanism's components through extensive ablation studies. The experimental results verify that the proposed algorithm delivers stable performance improvements at a modest additional labeling cost for the few-shot classification task. We also present the optimal methods for selecting effective samples to be labeled for the target few-shot classification algorithms.
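The four-step pipeline in the abstract (pre-processing, distribution fitting, scoring, selection) can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's exact method: the transform used for pre-processing, the choice of per-class diagonal Gaussians for distribution fitting, and the "least-likely first" selection policy are all assumptions, and every function name is hypothetical.

```python
import numpy as np

def preprocess(features, eps=1e-8):
    """Step 1 (assumed form): reduce skew with a square-root power
    transform, then L2-normalize each feature vector so the features
    form a well-behaved distribution."""
    f = np.sqrt(np.abs(features) + eps)
    return f / (np.linalg.norm(f, axis=1, keepdims=True) + eps)

def fit_class_gaussians(support_feats, support_labels):
    """Step 2 (assumed form): fit a diagonal Gaussian (mean, variance)
    to each class's support features."""
    stats = {}
    for c in np.unique(support_labels):
        x = support_feats[support_labels == c]
        stats[c] = (x.mean(axis=0), x.var(axis=0) + 1e-6)
    return stats

def score_instances(query_feats, stats):
    """Step 3 (assumed form): score each unlabeled instance by its best
    log-likelihood under any class Gaussian (higher = better explained
    by some class)."""
    scores = []
    for x in query_feats:
        lls = [
            -0.5 * np.sum((x - mu) ** 2 / var + np.log(2 * np.pi * var))
            for mu, var in stats.values()
        ]
        scores.append(max(lls))
    return np.array(scores)

def select_instances(scores, k):
    """Step 4 (assumed policy): send the k lowest-scoring (least well
    explained, i.e. most informative) instances for annotation."""
    return np.argsort(scores)[:k]
```

The scoring criterion here is one plausible instantiation; the paper reports experiments with several fitting and scoring algorithms, so a Fréchet-distance or distance-to-prototype score could be substituted without changing the surrounding pipeline.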
Pages: 133186-133195
Page count: 10