Error-Based Noise Filtering During Neural Network Training

Cited by: 2
Authors
Alharbi, Fahad [1]
Hindi, Khalil El [1]
Al-Ahmadi, Saad [1]
Affiliations
[1] King Saud Univ, Dept Comp Sci, Coll Comp & Informat Sci, Riyadh 11543, Saudi Arabia
Keywords
Licenses; Neural networks; convolutional neural networks; noisy data; semi-supervised learning; CLASSIFICATION;
DOI
10.1109/ACCESS.2020.3019465
CLC classification code
TP [Automation technology, computer technology];
Discipline classification code
0812;
Abstract
The problem of dealing with noisy data in neural network-based models has received increasing attention from researchers aiming to mitigate its consequences for learning. Some methods enhance the data as a preprocessing step before training, while others make the learning model itself aware of noise and thus able to handle noisy instances. We propose a simple and efficient method, Error-Based Filtering (EBF), used during training as a filtration technique for supervised learning in neural network-based models. EBF is independent of the model architecture and can therefore be incorporated into any neural network-based model. Our approach is based on monitoring and analyzing the distribution of the loss (error) values of each instance during training. In addition, EBF can be integrated with semi-supervised learning to take advantage of the identified noisy instances and further improve classification. An advantage of EBF is that it achieves performance competitive with other state-of-the-art methods while adding far fewer tasks to the training procedure. Our evaluation on three well-known benchmark datasets demonstrates improved classification accuracy in the presence of noise.
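The abstract describes the core idea of monitoring the distribution of per-instance loss values and filtering outliers as suspected noisy labels. The following is a minimal illustrative sketch in NumPy, not the authors' implementation: the specific decision rule (keep an instance only if its epoch loss is within `k` standard deviations of the mean loss) and the function name are assumptions made for illustration.

```python
import numpy as np

def error_based_filter(losses, k=2.0):
    """Illustrative filter over per-instance training losses.

    Assumed rule (not from the paper): an instance is kept only if its
    loss lies within k standard deviations of the mean epoch loss;
    instances above that threshold are treated as suspected noisy labels.

    losses : 1-D array of per-instance loss values from one epoch.
    Returns a boolean mask (True = keep, False = suspected noisy).
    """
    mu = losses.mean()
    sigma = losses.std()
    return losses <= mu + k * sigma

# Nine typical losses plus one extreme value standing in for a
# mislabeled instance whose loss stays high during training.
epoch_losses = np.array([0.20, 0.30, 0.25, 0.28, 0.22,
                         0.31, 0.27, 0.24, 0.26, 5.00])
keep_mask = error_based_filter(epoch_losses)
```

In a training loop, such a mask could be recomputed each epoch and used to exclude flagged instances from the supervised loss, or, as the abstract notes for the semi-supervised variant, to route them to an unlabeled pool instead of discarding them.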
Pages: 156996-157004
Page count: 9