Learning Deep Networks from Noisy Labels with Dropout Regularization

Cited by: 0
Authors
Jindal, Ishan [1 ]
Nokleby, Matthew [1 ]
Chen, Xuewen [2 ]
Affiliations
[1] Wayne State Univ, Elect & Comp Engn, Detroit, MI 48202 USA
[2] Wayne State Univ, Dept Comp Sci, Detroit, MI 48202 USA
Source
2016 IEEE 16TH INTERNATIONAL CONFERENCE ON DATA MINING (ICDM) | 2016
Funding
US National Science Foundation
Keywords
Supervised Learning; Deep Learning; Convolutional Neural Networks; Label Noise; Dropout Regularization;
DOI
10.1109/ICDM.2016.124
Chinese Library Classification
TP18 [Theory of Artificial Intelligence]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Large datasets often have unreliable labels, such as those obtained from Amazon's Mechanical Turk or social media platforms, and classifiers trained on mislabeled datasets often exhibit poor performance. We present a simple, effective technique for accounting for label noise when training deep neural networks. We augment a standard deep network with a softmax layer that models the label noise statistics. Then, we train the deep network and noise model jointly via end-to-end stochastic gradient descent on the (perhaps mislabeled) dataset. The augmented model is underdetermined, so in order to encourage the learning of a non-trivial noise model, we apply dropout regularization to the weights of the noise model during training. Numerical experiments on noisy versions of the CIFAR-10 and MNIST datasets show that the proposed dropout technique outperforms state-of-the-art methods.
Pages: 967-972 (6 pages)
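The noise model described in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the authors' code: the base network's class posteriors are passed through a row-stochastic noise matrix (a softmax over learned logits), and dropout is applied to the noise-model weights during training to discourage the trivial identity solution. All names (`noise_logits`, `noisy_predictions`, `dropout_rate`) are assumptions for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

num_classes = 10

# Learnable weights of the noise-model layer: row i holds the logits
# for P(observed label = j | true label = i).
noise_logits = np.zeros((num_classes, num_classes))

def noisy_predictions(clean_probs, noise_logits, dropout_rate=0.5, train=True):
    """Map the base network's posteriors through the softmax noise model.

    During training, inverted dropout zeroes a random subset of the
    noise-model weights (rescaling the rest), which regularizes the
    noise model toward a non-trivial solution.
    """
    logits = noise_logits
    if train and dropout_rate > 0:
        mask = rng.random(logits.shape) >= dropout_rate
        logits = logits * mask / (1.0 - dropout_rate)
    noise_matrix = softmax(logits, axis=1)   # each row sums to 1
    return clean_probs @ noise_matrix        # distribution over observed labels

# Example: posteriors from the base network for a batch of 2 examples.
clean = softmax(rng.normal(size=(2, num_classes)))
noisy = noisy_predictions(clean, noise_logits, train=False)
```

In the paper's setup, the base network and the noise matrix would be trained jointly by SGD on the cross-entropy between the output of the noise layer and the (possibly corrupted) labels; this sketch only shows the forward pass of the augmented model.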