Learning from Noisy Labeled Samples Using Prediction Norm for Image Classification

Cited by: 3
Authors
Okamura, Daiki [1 ]
Harakawa, Ryosuke [1 ]
Iwahashi, Masahiro [1 ]
Affiliations
[1] Nagaoka Univ Technol, Dept Elect Elect & Informat Engn, Nagaoka, Niigata 9402188, Japan
Source
2021 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN, AND CYBERNETICS (SMC) | 2021
Keywords
DOI
10.1109/SMC52423.2021.9659001
Chinese Library Classification (CLC)
TP3 [Computing Technology, Computer Technology]
Discipline Code
0812
Abstract
Learning a convolutional neural network (CNN) from noisily labeled samples is a crucial problem, and many studies have addressed it. Although the state-of-the-art method Joint Training with Co-Regularization (JoCoR) achieves high performance, it remains challenging to accurately classify samples with asymmetric noise, i.e., labels swapped between similar classes (e.g., CAT <-> DOG). In this paper, we find that the prediction norm differs between cleanly labeled and noisily labeled samples. We also find a positive correlation between this prediction norm difference and classification accuracy on the CIFAR-10 dataset. We therefore hypothesize that discriminative power improves when the prediction norm difference is enlarged. Based on this hypothesis, we propose a novel method for learning a CNN from noisily labeled samples. Specifically, we take JoCoR as the base architecture and weight its loss function so as to enlarge the prediction norm difference. Experimental results on the CIFAR-10 dataset support our hypothesis: the classification accuracy of the proposed method exceeds that of several state-of-the-art methods, including JoCoR, for samples with asymmetric noise as well as those with symmetric noise (i.e., labels swapped with arbitrary other classes).
Pages: 214-219
Number of pages: 6
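The abstract describes weighting JoCoR's loss so that the gap between the prediction norms of clean and noisy samples grows, but it does not give the exact formulation. The following is a minimal PyTorch-style sketch of the general idea only: the helper names (prediction_norm, jocor_style_loss, weighted_jocor_loss), the norm-based weighting, and the parameters co_lambda, keep_ratio, and alpha are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F


def prediction_norm(logits: torch.Tensor) -> torch.Tensor:
    """L2 norm of the softmax prediction vector, per sample.

    A confident (peaked) prediction has a norm near 1; a uniform
    prediction over C classes has a norm of 1/sqrt(C).
    """
    return F.softmax(logits, dim=1).norm(p=2, dim=1)


def jocor_style_loss(logits1, logits2, targets, co_lambda=0.9):
    """Per-sample JoCoR-style loss (simplified): cross-entropy for two
    networks plus a symmetric KL term encouraging their agreement."""
    ce = (F.cross_entropy(logits1, targets, reduction="none")
          + F.cross_entropy(logits2, targets, reduction="none"))
    logp1 = F.log_softmax(logits1, dim=1)
    logp2 = F.log_softmax(logits2, dim=1)
    kl = (F.kl_div(logp1, logp2, log_target=True, reduction="none").sum(dim=1)
          + F.kl_div(logp2, logp1, log_target=True, reduction="none").sum(dim=1))
    return (1.0 - co_lambda) * ce + co_lambda * kl


def weighted_jocor_loss(logits1, logits2, targets, keep_ratio=0.7, alpha=1.0):
    """Hypothetical weighting: small-loss samples are selected as in JoCoR,
    and each selected sample is weighted by its averaged prediction norm so
    that confidently predicted samples dominate, which should widen the norm
    gap between clean and noisy samples."""
    losses = jocor_style_loss(logits1, logits2, targets)
    norms = 0.5 * (prediction_norm(logits1) + prediction_norm(logits2))
    weights = norms.pow(alpha)                     # illustrative weighting choice
    n_keep = max(1, int(keep_ratio * losses.numel()))
    keep_idx = torch.argsort(losses)[:n_keep]      # small-loss selection
    return (weights[keep_idx] * losses[keep_idx]).mean()


# Example usage with random data (batch of 8, 10 classes); in practice the
# logits would come from the two co-trained networks, logits1 = net1(x), etc.
logits1 = torch.randn(8, 10, requires_grad=True)
logits2 = torch.randn(8, 10, requires_grad=True)
targets = torch.randint(0, 10, (8,))
loss = weighted_jocor_loss(logits1, logits2, targets)
loss.backward()
```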