PSSCL: A progressive sample selection framework with contrastive loss designed for noisy labels

Cited by: 3
Authors
Zhang, Qian [1 ]
Zhu, Yi [1 ]
Cordeiro, Filipe R. [3 ]
Chen, Qiu [2 ]
Affiliations
[1] School of Information Technology, Jiangsu Open University, Nanjing 210036, China
[2] Department of Electrical Engineering and Electronics, Graduate School of Engineering, Kogakuin University, Tokyo 163-8677, Japan
[3] Visual Computing Lab, Department of Computing, Universidade Federal Rural de Pernambuco, Recife, Brazil
Keywords
Noisy label; Deep neural networks; Semi-supervised learning; Image datasets
DOI
10.1016/j.patcog.2024.111284
CLC number
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Large-scale image datasets often contain unavoidable noisy labels, which cause deep neural networks to overfit and degrade their performance. Most existing methods for learning from noisy labels are one-stage frameworks in which training-data division and semi-supervised learning (SSL) are intertwined during optimization. Their effectiveness therefore depends heavily on the precision of the separated clean set, on prior knowledge of the noise, and on the robustness of the SSL component. In this paper, we propose PSSCL, a progressive sample selection framework with contrastive loss for noisy labels. The framework operates in two stages and uses robust and contrastive losses to strengthen the model. Stage I identifies a small clean set through a long-term confidence detection strategy, while Stage II improves performance by progressively expanding this clean set. PSSCL shows significant improvement over state-of-the-art methods across various benchmarks. The code is available at https://github.com/LanXiaoPang613/PSSCL.
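The abstract only sketches the mechanism, so the following is a minimal PyTorch sketch of what Stage I's long-term confidence detection could look like: the model's confidence on each sample's given (possibly noisy) label is tracked across epochs, and samples whose average confidence over a sliding window stays high form the initial small clean set. The class name, window length, and threshold below are illustrative assumptions, not the authors' implementation; see the linked repository for the official code.

import torch
import torch.nn.functional as F

class LongTermConfidenceSelector:
    """Illustrative Stage-I selector (hypothetical): keep samples whose
    confidence on their given label stays high over recent epochs."""

    def __init__(self, num_samples, history_len=10, threshold=0.9):
        # Ring buffer: one row of per-sample confidences per recorded epoch.
        # history_len and threshold are assumed hyperparameters, not from the paper.
        self.history = torch.zeros(history_len, num_samples)
        self.history_len = history_len
        self.threshold = threshold
        self.epoch = 0

    @torch.no_grad()
    def record(self, logits, labels, indices):
        # Confidence the model assigns to each sample's (possibly noisy) label.
        probs = F.softmax(logits, dim=1)
        conf = probs[torch.arange(len(labels), device=logits.device), labels]
        self.history[self.epoch % self.history_len, indices.cpu()] = conf.cpu()

    def step_epoch(self):
        self.epoch += 1

    def clean_set(self):
        # Samples whose mean confidence over the window exceeds the threshold
        # form the small, high-precision clean set that seeds Stage II.
        mean_conf = self.history.mean(dim=0)
        return torch.nonzero(mean_conf > self.threshold).squeeze(1)

In Stage II, such a selector could be reused with a progressively relaxed threshold, with the remaining samples treated as unlabeled data for SSL, to expand the clean set; this matches the progression the abstract describes, though the exact expansion criterion here is an assumption.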
Pages: 20