The annealing robust backpropagation (ARBP) learning algorithm

Cited by: 128
Authors
Chuang, CC [1 ]
Su, SF [1 ]
Hsiao, CC [1 ]
Affiliations
[1] Natl Taiwan Univ Sci & Technol, Dept Elect Engn, Taipei 10772, Taiwan
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 2000, Vol. 11, No. 5
Keywords
annealing schedule; outliers; robust learning algorithm;
DOI
10.1109/72.870040
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory];
Discipline codes
081104; 0812; 0835; 1405
Abstract
Multilayer feedforward neural networks are often referred to as universal approximators. Nevertheless, if the training data are corrupted by large noise, such as outliers, traditional backpropagation learning schemes may not always yield acceptable performance. Although various robust learning algorithms have been proposed in the literature, those approaches still suffer from the initialization problem. In those robust learning algorithms, the so-called M-estimator is employed. For M-estimation-type learning algorithms, the loss function is used to discriminate outliers from the majority of the data by degrading the effects of those outliers during learning. However, the loss function used in those algorithms may not correctly discriminate against those outliers. In this paper, the annealing robust backpropagation (ARBP) learning algorithm, which adopts the annealing concept into robust learning algorithms, is proposed to deal with the problem of modeling in the presence of outliers. The proposed algorithm has been applied to various examples, and the results all demonstrate its superiority over other robust learning algorithms, independently of the outliers. In this paper, not only is the annealing concept adopted into the robust learning algorithms, but the annealing schedule k/t was also found experimentally to achieve the best performance among the annealing schedules tested, where k is a constant and t is the epoch number.
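The annealing idea described in the abstract can be sketched as follows: backpropagation is run with a robust M-estimator influence function whose scale parameter is annealed as beta(t) = k/t, so training starts close to ordinary least squares (avoiding the initialization problem) and progressively down-weights large residuals (outliers). This is a minimal numpy sketch, not the paper's implementation: the Cauchy-type influence function, the network sizes, and the values of k and the learning rate are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = sin(x), with a few gross outliers injected.
X = np.linspace(-3.0, 3.0, 80).reshape(-1, 1)
y = np.sin(X).ravel()
outlier_idx = rng.choice(len(y), size=6, replace=False)
y[outlier_idx] += rng.uniform(5.0, 10.0, size=6)
clean = np.ones(len(y), dtype=bool)
clean[outlier_idx] = False

# One-hidden-layer tanh network (sizes are illustrative).
n_hidden = 12
W1 = rng.normal(0.0, 0.5, (1, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0.0, 0.5, (n_hidden, 1)); b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, (h @ W2 + b2).ravel()

_, y0 = forward(X)
mae_before = np.abs(y0[clean] - y[clean]).mean()

lr, k = 0.1, 10.0            # k is the annealing constant (value chosen arbitrarily here)
for t in range(1, 501):      # t is the epoch number
    beta = k / t             # annealing schedule beta(t) = k/t: large early, small late
    h, y_hat = forward(X)
    e = y_hat - y
    # Influence function of a Cauchy-type M-estimator with scale beta:
    # approximately the raw error e while beta is large (least-squares-like),
    # but it down-weights large residuals (outliers) as beta shrinks.
    psi = e / (1.0 + e**2 / beta)
    # Standard backpropagation, with psi(e) in place of the raw error e.
    gW2 = h.T @ psi[:, None] / len(X)
    gb2 = psi.mean()
    dh = (psi[:, None] * W2.T) * (1.0 - h**2)
    gW1 = X.T @ dh / len(X)
    gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, y_fit = forward(X)
mae_after = np.abs(y_fit[clean] - y[clean]).mean()
```

Because beta starts large, the early epochs behave much like plain backpropagation; as beta decays, the outliers' residuals fall into the flat tail of the robust loss and stop influencing the weights, so the final fit tracks the clean majority of the data.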
Pages: 1067-1077
Page count: 11