Self-paced resistance learning against overfitting on noisy labels

Cited by: 15
Authors
Shi, Xiaoshuang [1]
Guo, Zhenhua [2]
Li, Kang [3,4,5]
Liang, Yun [6]
Zhu, Xiaofeng [1]
Affiliations
[1] Univ Elect Sci & Technol China, Sch Comp Sci & Technol, Chengdu, Sichuan, Peoples R China
[2] Smart Transportat, Suzhou, Peoples R China
[3] Sichuan Univ, West China Hosp, West China Biomed Big Data Ctr, Chengdu, Sichuan, Peoples R China
[4] Sichuan Univ, MedX Ctr Informat, Chengdu, Sichuan, Peoples R China
[5] Shanghai Artificial Intelligence Lab, Shanghai, Peoples R China
[6] Univ Florida, J Crayton Pruitt Family Dept Biomed Engn, Gainesville, FL 32611 USA
Keywords
Convolutional neural networks; Self-paced resistance; Model overfitting; Noisy labels; Classification
DOI
10.1016/j.patcog.2022.109080
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Noisy labels, a mixture of correct and corrupted ones, are pervasive in practice. They can significantly degrade the performance of convolutional neural networks (CNNs), because CNNs are easily overfitted on corrupted labels. To address this issue, inspired by the observation that deep neural networks tend to memorize probably correct-label data first and corrupt-label samples later, we propose a novel yet simple self-paced resistance framework that resists corrupted labels without using any clean validation data. The proposed framework first exploits the memorization effect of CNNs to learn a curriculum, which contains confident samples and provides meaningful supervision for the other training samples. It then updates the model parameters using the selected confident samples and a proposed resistance loss; the resistance loss tends to smooth the parameter updates or drive the predictions toward equal probability over each class, thereby resisting model overfitting on corrupted labels. Finally, we unify these two modules into a single loss function and optimize it by alternating learning. Extensive experiments demonstrate the significantly superior performance of the proposed framework over recent state-of-the-art methods on noisy-label data. Source code is available at https://github.com/xsshi2015/Self-paced-Resistance-Learning. © 2022 Elsevier Ltd. All rights reserved.
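The resistance loss is described only qualitatively above (it smooths parameter updates or pushes predictions toward equal probability over the classes, applied to confident samples chosen via the memorization effect). Below is a minimal PyTorch sketch of how such a scheme might look, assuming cross-entropy on the small-loss ("confident") samples plus a KL-to-uniform penalty as the equal-prediction term; the names `resistance_loss` and `select_confident` and the weight `lam` are illustrative, not taken from the paper (the authors' actual implementation is at the GitHub link above).

```python
import torch
import torch.nn.functional as F

def select_confident(per_sample_loss, keep_ratio=0.5):
    """Self-paced selection sketch: keep the fraction of samples with
    the smallest loss, which (by the memorization effect) are the most
    likely to carry correct labels. `keep_ratio` is an illustrative knob."""
    k = max(1, int(keep_ratio * per_sample_loss.numel()))
    return torch.argsort(per_sample_loss)[:k]

def resistance_loss(logits, targets, lam=0.5):
    """Illustrative resistance-style loss (not the paper's exact form):
    cross-entropy on confident samples plus a penalty pulling each
    prediction toward equal probability over the classes."""
    ce = F.cross_entropy(logits, targets)
    num_classes = logits.size(1)
    log_probs = F.log_softmax(logits, dim=1)
    # KL(uniform || p) is minimized when the prediction p assigns the
    # same probability to every class, resisting overconfident fits.
    uniform = torch.full_like(log_probs, 1.0 / num_classes)
    kl_to_uniform = F.kl_div(log_probs, uniform, reduction="batchmean")
    return ce + lam * kl_to_uniform

# Alternating step sketch: pick confident samples, then update on them.
# logits = model(images)
# per_sample = F.cross_entropy(logits, labels, reduction="none")
# idx = select_confident(per_sample.detach(), keep_ratio=0.5)
# loss = resistance_loss(logits[idx], labels[idx], lam=0.5)
# loss.backward()
```

The KL-to-uniform term vanishes exactly when the network predicts every class with equal probability, so raising `lam` damps overconfident fitting of suspect labels, while the small-loss selection reflects the memorization effect noted in the abstract.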
Pages: 12
Related papers
50 records in total
  • [41] Weighted Self-Paced Learning with Belief Functions
    Zhang, Shixing
    Han, Deqiang
    Dezert, Jean
    Yang, Yi
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 255
  • [42] Finding Age Path of Self-Paced Learning
    Gu, Bin
    Zhai, Zhou
    Li, Xiang
    Huang, Heng
    2021 21ST IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM 2021), 2021, : 151 - 160
  • [43] Self-Paced Contrastive Learning for Semi-supervised Medical Image Segmentation with Meta-labels
    Peng, Jizong
    Wang, Ping
    Desrosiers, Christian
    Pedersoli, Marco
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [44] A probabilistic interpretation of self-paced learning with applications to reinforcement learning
    Klink, Pascal
    Abdulsamad, Hany
    Belousov, Boris
    D'Eramo, Carlo
    Peters, Jan
    Pajarinen, Joni
JOURNAL OF MACHINE LEARNING RESEARCH, 2021, 22
  • [45] Self-Paced AutoEncoder
    Yu, Tingzhao
    Guo, Chaoxu
    Wang, Lingfeng
    Xiang, Shiming
    Pan, Chunhong
    IEEE SIGNAL PROCESSING LETTERS, 2018, 25 (07) : 1054 - 1058
  • [46] SELF-PACED CHEMISTRY
    HAWKES, SJ
    JOURNAL OF CHEMICAL EDUCATION, 1984, 61 (06) : 564 - 565
  • [47] SELF-PACED TRAINING
    FRIEDMAN, R
BYTE, 1995, 20 (06): 45 - 45
  • [48] Extreme Learning Machine for Supervised Classification with Self-paced Learning
    Li, Li
    Zhao, Kaiyi
    Li, Sicong
    Sun, Ruizhi
    Cai, Saihua
    NEURAL PROCESSING LETTERS, 2020, 52 (03) : 1723 - 1744