Jo-SRC: A Contrastive Approach for Combating Noisy Labels

Cited by: 122
Authors
Yao, Yazhou [1 ]
Sun, Zeren [1 ]
Zhang, Chuanyi [1 ]
Shen, Fumin [2 ]
Wu, Qi [3 ]
Zhang, Jian [4 ]
Tang, Zhenmin [1 ]
Affiliations
[1] Nanjing Univ Sci & Technol, Nanjing, Peoples R China
[2] Univ Elect Sci & Technol China, Chengdu, Peoples R China
[3] Univ Adelaide, Adelaide, SA, Australia
[4] Univ Technol Sydney, Sydney, NSW, Australia
Source
2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR 2021) | 2021
Funding
National Natural Science Foundation of China;
Keywords
DOI
10.1109/CVPR46437.2021.00515
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Due to the memorization effect in Deep Neural Networks (DNNs), training with noisy labels usually results in inferior model performance. Existing state-of-the-art methods primarily adopt a sample selection strategy, which selects small-loss samples for subsequent training. However, prior literature tends to perform sample selection within each mini-batch, neglecting the imbalance of noise ratios across different mini-batches. Moreover, valuable knowledge within high-loss samples is wasted. To this end, we propose a noise-robust approach named Jo-SRC (Joint Sample Selection and Model Regularization based on Consistency). Specifically, we train the network in a contrastive learning manner. Predictions from two different views of each sample are used to estimate its "likelihood" of being clean or out-of-distribution. Furthermore, we propose a joint loss to advance the model generalization performance by introducing consistency regularization. Extensive experiments have validated the superiority of our approach over existing state-of-the-art methods.
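The abstract only sketches the selection idea at a high level. A minimal PyTorch-style sketch of that general idea, partitioning a batch into clean, in-distribution-noisy, and out-of-distribution samples from the agreement between two augmented views and the given label, might look like the following. All function names, thresholds, and the specific use of Jensen-Shannon divergence here are illustrative assumptions, not the authors' released implementation.

```python
import torch
import torch.nn.functional as F


def js_divergence(p, q, eps=1e-8):
    """Jensen-Shannon divergence between two batches of probability vectors."""
    m = 0.5 * (p + q)
    kl_pm = (p * (p.add(eps).log() - m.add(eps).log())).sum(dim=1)
    kl_qm = (q * (q.add(eps).log() - m.add(eps).log())).sum(dim=1)
    return 0.5 * (kl_pm + kl_qm)


def partition_batch(logits_v1, logits_v2, labels, num_classes,
                    clean_thresh=0.3, ood_thresh=0.3):
    """Hypothetical selection step: score each sample by (a) agreement between
    its prediction and its given label and (b) agreement between its two views,
    then split the batch into clean / noisy-in-distribution / out-of-distribution."""
    p1 = F.softmax(logits_v1, dim=1)
    p2 = F.softmax(logits_v2, dim=1)
    onehot = F.one_hot(labels, num_classes).float()

    # Small divergence from the given label -> probably a clean sample.
    label_div = js_divergence(p1, onehot)
    # Small divergence between the two views -> probably in-distribution even if
    # the label is wrong; large divergence -> possibly out-of-distribution.
    view_div = js_divergence(p1, p2)

    clean_mask = label_div < clean_thresh
    ood_mask = (~clean_mask) & (view_div > ood_thresh)
    noisy_id_mask = (~clean_mask) & (~ood_mask)
    return clean_mask, noisy_id_mask, ood_mask
```

In such a scheme, the clean subset would be trained with the given labels, while the remaining subsets could be relabeled, regularized for consistency, or down-weighted; the thresholds above are placeholders rather than values reported in the paper.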
Pages: 5188-5197
Page count: 10