A Framework Using Contrastive Learning for Classification with Noisy Labels

Cited by: 6
Authors
Ciortan, Madalina [1 ]
Dupuis, Romain [1 ]
Peel, Thomas [1 ]
Affiliations
[1] EURA NOVA, R&D Dept, B-1435 Mons, Belgium
Keywords
noisy labels; image classification; contrastive learning; robust loss
DOI
10.3390/data6060061
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
We propose a framework using contrastive learning as a pre-training task to perform image classification in the presence of noisy labels. Recent strategies, such as pseudo-labeling, sample selection with Gaussian mixture models, and weighted supervised contrastive learning, have been combined into a fine-tuning phase following the pre-training. In this paper, we provide an extensive empirical study showing that a preliminary contrastive learning step brings a significant gain in performance when using different loss functions: non-robust, robust, and early-learning regularized. Our experiments, performed on standard benchmarks and real-world datasets, demonstrate that: (i) the contrastive pre-training increases the robustness of any loss function to noisy labels, and (ii) the additional fine-tuning phase can further improve accuracy, but at the cost of additional complexity.
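The two-phase recipe summarized in the abstract can be illustrated in code. The following is a minimal sketch, not the authors' implementation: it assumes a SimCLR-style NT-Xent loss for the contrastive pre-training phase and uses the Generalized Cross Entropy loss as one example of a robust loss for the fine-tuning phase; the PyTorch setting, function names, data-loader format, and hyper-parameters are all illustrative assumptions.

```python
# Sketch of the two-phase pipeline described in the abstract (illustrative only):
# (1) self-supervised contrastive pre-training of an encoder, ignoring labels;
# (2) fine-tuning a classifier on the noisy labels with a robust loss.
import torch
import torch.nn as nn
import torch.nn.functional as F


def nt_xent_loss(z1, z2, temperature=0.5):
    """SimCLR-style contrastive loss over two augmented views (each N x d)."""
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)      # 2N x d, unit-norm rows
    sim = z @ z.t() / temperature                            # 2N x 2N similarity matrix
    n = z1.size(0)
    mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim.masked_fill_(mask, float('-inf'))                    # discard self-similarity
    # Row i < n has its positive at i + n, and vice versa.
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)


class GCELoss(nn.Module):
    """Generalized Cross Entropy, one possible choice of robust loss."""
    def __init__(self, q=0.7):
        super().__init__()
        self.q = q

    def forward(self, logits, targets):
        probs = F.softmax(logits, dim=1)
        p_true = probs.gather(1, targets.unsqueeze(1)).squeeze(1).clamp_min(1e-7)
        return ((1.0 - p_true.pow(self.q)) / self.q).mean()


def pretrain_contrastive(encoder, proj_head, loader, optimizer, epochs=100):
    """Phase 1: labels are never used; the loader is assumed to yield two
    augmented views of each image plus the (ignored) noisy label."""
    for _ in range(epochs):
        for (view1, view2), _noisy_labels in loader:
            z1, z2 = proj_head(encoder(view1)), proj_head(encoder(view2))
            loss = nt_xent_loss(z1, z2)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()


def finetune_with_noisy_labels(encoder, classifier, loader, optimizer, epochs=50):
    """Phase 2: train a classification head (optionally the encoder too)
    on the noisy labels with a robust loss."""
    criterion = GCELoss(q=0.7)
    for _ in range(epochs):
        for images, noisy_labels in loader:
            logits = classifier(encoder(images))
            loss = criterion(logits, noisy_labels)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
```

Because the noisy labels are never consulted in phase 1, the pre-trained representation is, by construction, insensitive to label noise; phase 2 then attaches and trains a classifier with whichever loss family (non-robust, robust, or early-learning regularized) is under study.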
Pages: 26