Learning to Learn from Noisy Labeled Data

Cited by: 220
Authors
Li, Junnan [1]
Wong, Yongkang [1]
Zhao, Qi [2]
Kankanhalli, Mohan S. [1]
Affiliations
[1] Natl Univ Singapore, Singapore, Singapore
[2] Univ Minnesota, Minneapolis, MN 55455 USA
Source
2019 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2019) | 2019
Funding
National Research Foundation, Singapore
DOI
10.1109/CVPR.2019.00519
CLC number
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Despite the success of deep neural networks (DNNs) in image classification tasks, their human-level performance relies on massive training data with high-quality manual annotations, which are expensive and time-consuming to collect. There exist many inexpensive data sources on the web, but they tend to contain inaccurate labels. Training on noisily labeled datasets causes performance degradation because DNNs can easily overfit to the label noise. To overcome this problem, we propose a noise-tolerant training algorithm, where a meta-learning update is performed prior to the conventional gradient update. The proposed meta-learning method simulates actual training by generating synthetic noisy labels, and trains the model such that after one gradient update using each set of synthetic noisy labels, the model does not overfit to the specific noise. We conduct extensive experiments on the noisy CIFAR-10 dataset and the Clothing1M dataset. The results demonstrate the advantageous performance of the proposed method compared to state-of-the-art baselines.
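The abstract describes a MAML-style scheme: generate several synthetic noisy label sets, take a lookahead gradient step on each, and penalize drift of the post-update model from the current model's predictions, before the conventional update. Below is a minimal first-order sketch of that idea on a toy linear-regression model; the model, loss, hyperparameters, and the exact consistency objective are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: a linear model stands in for the DNN classifier.
X = rng.normal(size=(32, 4))
w_true = rng.normal(size=4)
y = X @ w_true  # clean labels (regression targets)

def loss_grad(w, X, y):
    """Gradient of the mean squared error 0.5 * ||Xw - y||^2 / n."""
    return X.T @ (X @ w - y) / len(y)

def meta_train_step(w, X, y, n_sets=5, noise_rate=0.3,
                    inner_lr=0.1, meta_lr=0.05):
    """One noise-tolerant meta update (first-order approximation):
    for each synthetic noisy label set, take a lookahead step on the
    noisy labels, then pull the post-update model back toward the
    predictions of the current (pre-update) model."""
    teacher_pred = X @ w  # predictions before the lookahead steps
    meta_grad = np.zeros_like(w)
    for _ in range(n_sets):
        # Synthesize a noisy label set by corrupting a fraction of labels.
        y_noisy = y.copy()
        idx = rng.random(len(y)) < noise_rate
        y_noisy[idx] = rng.permutation(y)[idx]
        # Inner (lookahead) gradient step on the synthetic noisy labels.
        w_fast = w - inner_lr * loss_grad(w, X, y_noisy)
        # Consistency loss: post-update predictions vs. teacher predictions.
        meta_grad += loss_grad(w_fast, X, teacher_pred)
    return w - meta_lr * meta_grad / n_sets

# Demo: corrupt the observed labels, then alternate the meta update
# with the conventional gradient update, as in the abstract.
y_obs = y.copy()
flip = rng.random(len(y)) < 0.3
y_obs[flip] = rng.permutation(y)[flip]

w = np.zeros(4)
for _ in range(200):
    w = meta_train_step(w, X, y_obs)       # meta-learning update first
    w = w - 0.1 * loss_grad(w, X, y_obs)   # then conventional update
clean_mse = float(np.mean((X @ w - y) ** 2))
```

The sketch uses a first-order shortcut (the meta-gradient is evaluated at the post-update weights rather than differentiated through the inner step); the full method backpropagates through the lookahead update and uses a mentor network for the consistency targets.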
Pages: 5046-5054
Page count: 9