Self-paced resistance learning against overfitting on noisy labels

Cited by: 15
Authors
Shi, Xiaoshuan [1 ]
Guo, Zhenhua [2 ]
Li, Kang [3 ,4 ,5 ]
Liang, Yun [6 ]
Zhu, Xiaofeng [1 ]
Affiliations
[1] Univ Elect Sci & Technol China, Sch Comp Sci & Technol, Chengdu, Sichuan, Peoples R China
[2] Smart Transportat, Suzhou, Peoples R China
[3] Sichuan Univ, West China Hosp, West China Biomed Big Data Ctr, Chengdu, Sichuan, Peoples R China
[4] Sichuan Univ, MedX Ctr Informat, Chengdu, Sichuan, Peoples R China
[5] Shanghai Artificial Intelligence Lab, Shanghai, Peoples R China
[6] Univ Florida, J Crayton Pruitt Family Dept Biomed Engn, Gainesville, FL 32611 USA
Keywords
Convolutional neural networks; Self-paced resistance; Model overfitting; Noisy labels; CLASSIFICATION;
DOI
10.1016/j.patcog.2022.109080
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Discipline classification codes
081104; 0812; 0835; 1405;
Abstract
Noisy labels, composed of correct and corrupted ones, are pervasive in practice. They can significantly deteriorate the performance of convolutional neural networks (CNNs), because CNNs are easily overfitted on corrupted labels. To address this issue, inspired by the observation that deep neural networks tend to first memorize probably correct-label data and only later corrupt-label samples, we propose a novel yet simple self-paced resistance framework to resist corrupted labels without using any clean validation data. The proposed framework first utilizes the memorization effect of CNNs to learn a curriculum, which contains confident samples and provides meaningful supervision for the other training samples. It then adopts the selected confident samples and a proposed resistance loss to update the model parameters; the resistance loss tends to smooth the parameter updates or to attain equivalent predictions over all classes, thereby resisting model overfitting on corrupted labels. Finally, we unify these two modules into a single loss function and optimize it by alternating learning. Extensive experiments demonstrate the significantly superior performance of the proposed framework over recent state-of-the-art methods on noisy-label data. Source code for the proposed method is available at https://github.com/xsshi2015/Self-paced-Resistance-Learning. © 2022 Elsevier Ltd. All rights reserved.
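The abstract combines two ideas: selecting a small-loss ("confident") curriculum via the memorization effect, and a resistance term that pushes the remaining samples toward equivalent predictions over the classes. A minimal NumPy sketch of that combination is given below; it is an illustrative assumption of how such a loss could be composed, not the paper's exact formulation — the function name, the `keep_ratio` selection rule, and the fixed 0.1 weight on the resistance term are all hypothetical choices.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def self_paced_resistance_loss(logits, labels, keep_ratio=0.7):
    """Sketch: cross-entropy on the small-loss (confident) fraction of the
    batch, plus a resistance term nudging the remaining samples toward a
    uniform ("equivalent") prediction over all classes."""
    n, _ = logits.shape
    p = softmax(logits)
    ce = -np.log(p[np.arange(n), labels] + 1e-12)   # per-sample cross-entropy
    k = max(1, int(keep_ratio * n))
    confident = np.argsort(ce)[:k]                  # small-loss curriculum
    rest = np.setdiff1d(np.arange(n), confident)    # samples to "resist"
    ce_term = ce[confident].mean()
    # KL(uniform || p) equals mean(-log p) up to an additive constant,
    # so this term is minimized when predictions are uniform over classes.
    resist_term = -np.log(p[rest] + 1e-12).mean() if rest.size else 0.0
    return ce_term + 0.1 * resist_term, confident

# Usage: a toy batch of 8 samples and 3 classes.
rng = np.random.default_rng(0)
logits = rng.normal(size=(8, 3))
labels = rng.integers(0, 3, size=8)
loss, idx = self_paced_resistance_loss(logits, labels)
```

In practice the curriculum and the network weights would be updated alternately, as the single-loss formulation in the abstract suggests; here the selection is recomputed from scratch on each call for simplicity.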
Pages: 12