Self-paced resistance learning against overfitting on noisy labels

Cited by: 15
Authors
Shi, Xiaoshuan [1 ]
Guo, Zhenhua [2 ]
Li, Kang [3 ,4 ,5 ]
Liang, Yun [6 ]
Zhu, Xiaofeng [1 ]
Affiliations
[1] Univ Elect Sci & Technol China, Sch Comp Sci & Technol, Chengdu, Sichuan, Peoples R China
[2] Smart Transportat, Suzhou, Peoples R China
[3] Sichuan Univ, West China Hosp, West China Biomed Big Data Ctr, Chengdu, Sichuan, Peoples R China
[4] Sichuan Univ, MedX Ctr Informat, Chengdu, Sichuan, Peoples R China
[5] Shanghai Artificial Intelligence Lab, Shanghai, Peoples R China
[6] Univ Florida, J Crayton Pruitt Family Dept Biomed Engn, Gainesville, FL 32611 USA
Keywords
Convolutional neural networks; Self-paced resistance; Model overfitting; Noisy labels; CLASSIFICATION;
DOI
10.1016/j.patcog.2022.109080
CLC Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Noisy labels, comprising both correct and corrupted ones, are pervasive in practice. They can significantly deteriorate the performance of convolutional neural networks (CNNs), because CNNs are easily overfitted on corrupted labels. To address this issue, inspired by the observation that deep neural networks tend to memorize probably correct-label data first and corrupt-label samples later, we propose a novel yet simple self-paced resistance framework to resist corrupted labels, without using any clean validation data. The proposed framework first utilizes the memorization effect of CNNs to learn a curriculum, which contains confident samples and provides meaningful supervision for the other training samples. It then adopts the selected confident samples and a proposed resistance loss to update model parameters; the resistance loss tends to smooth the model parameters' update or to attain equal predictions over all classes, thereby resisting model overfitting on corrupted labels. Finally, we unify these two modules into a single loss function and optimize it via alternating learning. Extensive experiments demonstrate the significantly superior performance of the proposed framework over recent state-of-the-art methods on noisy-label data. Source code is available at https://github.com/xsshi2015/Self-paced-Resistance-Learning. (c) 2022 Elsevier Ltd. All rights reserved.
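
To make the objective described in the abstract concrete, the following is a minimal PyTorch sketch of a loss in this spirit: cross-entropy on a "confident" small-loss subset of each batch, plus a resistance term that pulls predictions toward the uniform distribution over classes. This is an illustration under stated assumptions, not the authors' implementation (see the repository linked above); the small-loss selection rule, the KL-to-uniform resistance term, and the parameters keep_ratio and lambda_res are hypothetical choices made for this sketch.

    import torch
    import torch.nn.functional as F

    def self_paced_resistance_loss(logits, labels, keep_ratio=0.7, lambda_res=0.1):
        # NOTE: illustrative sketch only; keep_ratio, lambda_res, and the
        # KL-to-uniform term are assumptions, not the paper's exact formulation.
        per_sample_ce = F.cross_entropy(logits, labels, reduction="none")

        # Curriculum: CNNs tend to memorize probably-clean labels first, so
        # treat the smallest-loss samples in the batch as the confident subset.
        n_keep = max(1, int(keep_ratio * logits.size(0)))
        confident_idx = torch.topk(per_sample_ce, n_keep, largest=False).indices
        ce_loss = per_sample_ce[confident_idx].mean()

        # Resistance: divergence from the uniform distribution over classes,
        # penalizing over-confident predictions and smoothing the update.
        log_probs = F.log_softmax(logits, dim=1)
        uniform = torch.full_like(log_probs, 1.0 / logits.size(1))
        resistance = F.kl_div(log_probs, uniform, reduction="batchmean")

        return ce_loss + lambda_res * resistance

In training, such a loss would be computed per mini-batch while the confident subset is refreshed as the model evolves (e.g., every epoch), mirroring the alternating optimization the abstract describes: fix the model to update the curriculum, then fix the curriculum to update the model.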
Pages: 12
Related Papers (50 total)
  • [21] Zhang, Kai; Song, Chengyun; Qiu, Lianpeng. Self-paced deep clustering with learning loss. PATTERN RECOGNITION LETTERS, 2023, 171: 8-14.
  • [22] Cong, Wei; Cong, Yang; Sun, Gan; Liu, Yuyang; Dong, Jiahua. Self-Paced Weight Consolidation for Continual Learning. IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2024, 34(04): 2209-2222.
  • [23] Zinder, Y.; Nicorovici, N.; Langtry, T. MATHEMATICA BASED PLATFORM FOR SELF-PACED LEARNING. EDULEARN10: INTERNATIONAL CONFERENCE ON EDUCATION AND NEW LEARNING TECHNOLOGIES, 2010: 323-330.
  • [24] Bodendorf, Freimut; Goetzelt, Kai-Uwe. Contextualization of Learning Objects for Self-Paced Learning Environments. PROCEEDINGS OF THE SIXTH INTERNATIONAL CONFERENCE ON SYSTEMS (ICONS 2011), 2011: 157-160.
  • [25] Ren, Yazhou; Zhao, Peng; Xu, Zenglin; Yao, Dezhong. Balanced Self-Paced Learning with Feature Corruption. 2017 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2017: 2064-2071.
  • [26] Wan, Yu; Yang, Baosong; Wong, Derek F.; Zhou, Yikai; Chao, Lidia S.; Zhang, Haibo; Chen, Boxing. Self-Paced Learning for Neural Machine Translation. PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020: 1074-1080.
  • [27] Murugesan, Keerthiram; Carbonell, Jaime. Self-Paced Multitask Learning with Shared Knowledge. PROCEEDINGS OF THE TWENTY-SIXTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2017: 2522-2528.
  • [28] Li, Changsheng; Yan, Junchi; Wei, Fan; Dong, Weishan; Liu, Qingshan; Zha, Hongyuan. Self-Paced Multi-Task Learning. THIRTY-FIRST AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2017: 2175-2181.
  • [29] Li, Changsheng; Wei, Fan; Yan, Junchi; Zhang, Xiaoyu; Liu, Qingshan; Zha, Hongyuan. A Self-Paced Regularization Framework for Multilabel Learning. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2018, 29(06): 2660-2666.
  • [30] Li, Hao; Gong, Maoguo; Meng, Deyu; Miao, Qiguang. Multi-Objective Self-Paced Learning. THIRTIETH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2016: 1802-1808.