Enhancing Metric-Based Few-Shot Classification With Weighted Large Margin Nearest Center Loss

Cited by: 4
Authors
Bao, Wei [1 ,2 ]
Huang, Meiyu [1 ]
Xiang, Xueshuang [1 ]
Affiliations
[1] China Acad Space Technol, Qian Xuesen Lab Space Technol, Beijing 100094, Peoples R China
[2] Beijing Inst Technol, Sch Informat & Elect Engn, Beijing 100081, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Task analysis; Training; Extraterrestrial measurements; Cats; Additives; Prototypes; Optimization; Few-shot classification; metric learning; large margin nearest center loss; weighted large margin nearest center loss;
DOI
10.1109/ACCESS.2021.3091704
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
Metric-learning-based methods, which learn a deep embedding space from a large number of sampled episodes, have been successfully applied to few-shot classification problems. In this paper, we propose adopting a large margin nearest center (LMNC) loss during episodic training to enhance metric-learning-based few-shot classification methods. Loss functions commonly used in episodic training (such as cross-entropy and mean squared error) strive for the strict goal that differently labeled examples in the embedding space be separated by an infinite distance. However, the learned embedding space cannot guarantee that this goal is achieved for every episode sampled from a large number of classes. Instead of an infinite distance, the LMNC loss requires only that differently labeled examples be separated by a large margin, which relaxes the strict constraint of the traditional loss functions and makes a discriminative embedding space easier to learn. Moreover, to account for the varying degrees of similarity between classes, we relax the fixed-margin constraint and extend the LMNC loss to a weighted LMNC (WLMNC) loss, which exploits interclass information to achieve a more separable embedding space with adaptive interclass margins. Experiments on standard benchmarks demonstrate that adopting the LMNC and WLMNC losses substantially improves the embedding quality and classification accuracy of metric-based few-shot classification methods across various few-shot scenarios. In particular, the LMNC and WLMNC losses yield accuracy gains of 1.86% and 2.46%, respectively, for the prototypical network on miniImageNet in the 5-way 1-shot scenario.
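The abstract describes the LMNC and WLMNC losses only at a high level. Below is a minimal PyTorch sketch of what an LMNC-style episodic loss might look like when embedded query examples are compared against class centers (prototypes) computed from the support set, with an optional per-class-pair margin matrix standing in for the weighted (WLMNC) variant. The function name lmnc_loss, its arguments, and the pull/push decomposition (borrowed from the classic LMNN formulation) are illustrative assumptions, not the paper's actual implementation.

# Sketch only: an LMNC-style loss under the assumptions stated above.
import torch
import torch.nn.functional as F

def lmnc_loss(queries, prototypes, labels, margin=1.0, class_margins=None):
    # queries:       (Q, D) embedded query examples
    # prototypes:    (C, D) per-class centers from the support set
    # labels:        (Q,)   LongTensor of query class indices
    # margin:        scalar large margin shared by all class pairs (LMNC)
    # class_margins: optional (C, C) tensor of adaptive interclass margins (WLMNC-like)

    # Squared Euclidean distance from each query to every class center: (Q, C)
    dists = torch.cdist(queries, prototypes).pow(2)

    # "Pull" term: distance to the query's own class center, shape (Q, 1)
    pull = dists.gather(1, labels.view(-1, 1))

    # Margin between each query's true class and every other class, shape (Q, C)
    if class_margins is None:
        m = torch.full_like(dists, margin)
    else:
        m = class_margins[labels]

    # "Push" term: hinge penalty whenever another class center is not at least
    # a margin farther away than the true class center
    hinge = F.relu(pull + m - dists)
    hinge = hinge.scatter(1, labels.view(-1, 1), 0.0)  # do not penalize the true class

    return pull.mean() + hinge.sum(dim=1).mean()

In episodic training, prototypes would be the per-class means of the embedded support examples, as in a prototypical network (e.g., a 5-way episode with 15 queries per class gives queries of shape (75, D), prototypes of shape (5, D), and labels of shape (75,)); the class_margins matrix is a placeholder for the adaptive interclass margins that the WLMNC loss derives from interclass information.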
Pages: 90805-90815
Page count: 11