Learning to teach and learn for semi-supervised few-shot image classification

Cited by: 8
Authors
Li, Xinzhe [1]
Huang, Jianqiang [2]
Liu, Yaoyao [3]
Zhou, Qin [2]
Zheng, Shibao [1]
Schiele, Bernt [3]
Sun, Qianru [4]
Affiliations
[1] Shanghai Jiao Tong Univ, Inst Image Commun & Network Engn, Shanghai 201100, Peoples R China
[2] Alibaba DAMO Acad, Hangzhou 310012, Peoples R China
[3] Max Planck Inst Informat, Saarland Informat Campus, D-66123 Saarbrucken, Germany
[4] Singapore Management Univ, Sch Comp & Informat Syst, Singapore 178902, Singapore
Funding
National Natural Science Foundation of China;
Keywords
Few-shot learning; Meta-learning; Semi-supervised learning;
DOI
10.1016/j.cviu.2021.103270
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This paper presents a novel semi-supervised few-shot image classification method named Learning to Teach and Learn (LTTL) to effectively leverage unlabeled samples in small-data regimes. Our method is based on self-training, which assigns pseudo labels to unlabeled data. However, the conventional pseudo-labeling operation relies heavily on the initial model trained on a handful of labeled data and may produce many noisily labeled samples. We propose to solve this problem in three steps: first, a cherry-picking step selects valuable samples from the pseudo-labeled data using a soft weighting network; second, cross-teaching lets the classifiers teach each other to reject more noisy labels, with a feature-synthesizing strategy introduced to avoid clean samples being rejected by mistake; finally, the classifiers are fine-tuned on the few labeled samples to avoid gradient drift. We use the meta-learning paradigm to optimize the parameters of the whole framework. The proposed LTTL combines the strengths of meta-learning and self-training, achieving superior performance over baseline methods on two public benchmarks.
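The pseudo-labeling and cherry-picking steps described in the abstract can be illustrated with a minimal sketch. Note that in LTTL the soft weighting network is a learned, meta-optimized component; here a simple max-probability confidence heuristic stands in for it, and the function names (`pseudo_label_with_weights`, `confidence_weight`) are hypothetical, not from the paper.

```python
import numpy as np

def pseudo_label_with_weights(probs, weight_fn):
    """Assign hard pseudo labels to unlabeled samples and score each
    with a soft weight, mimicking the cherry-picking step.
    probs: (N, C) class probabilities from the initial classifier."""
    pseudo = probs.argmax(axis=1)   # hard pseudo labels
    weights = weight_fn(probs)      # soft weights in [0, 1]
    return pseudo, weights

def confidence_weight(probs):
    # Stand-in for the learned soft weighting network: here the weight
    # is simply the maximum class probability (prediction confidence).
    return probs.max(axis=1)

# Toy example: 3 unlabeled samples, 2 classes.
probs = np.array([[0.90, 0.10],
                  [0.55, 0.45],
                  [0.20, 0.80]])
labels, w = pseudo_label_with_weights(probs, confidence_weight)
# The second sample gets a low weight (0.55), so a downstream weighted
# loss would largely discount its possibly noisy pseudo label.
```

In the full method these weights would multiply each sample's loss term during training, and the weighting network itself is updated via meta-learning rather than fixed as above.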
Pages: 10