NOISE INJECTION INTO INPUTS IN BACKPROPAGATION LEARNING

Cited by: 190
Author
MATSUOKA, K
Affiliation
[1] Division of Control Engineering, Kyushu Institute of Technology, Tobata
Source
IEEE TRANSACTIONS ON SYSTEMS MAN AND CYBERNETICS | 1992, Vol. 22, No. 3
DOI
10.1109/21.155944
Chinese Library Classification (CLC)
TP3 [Computing technology, computer technology]
Discipline classification code
0812
Abstract
Back-propagation can be viewed as a nonlinear regression technique that allows a nonlinear neural network to acquire an input/output (I/O) association from a limited number of samples drawn from a population of input and output patterns. A crucial problem in back-propagation is its generalization capability: a network successfully trained on given samples is not guaranteed to provide the desired associations for untrained inputs as well. Regarding this problem, several authors have shown experimentally that the generalization capability can be remarkably enhanced by training the network with noise-injected inputs. This paper mathematically explains why and how injecting noise into the inputs has such an effect.
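The training scheme the abstract analyzes can be sketched as follows: at each epoch, a small zero-mean random perturbation is added to the training inputs before the forward pass, and plain gradient-descent back-propagation proceeds on the perturbed inputs. This is a minimal numpy illustration, not the paper's exact setup; the network size (1-8-1), the target function, the learning rate, and the noise standard deviation `sigma` are all assumed for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny 1-8-1 network trained by full-batch gradient descent (back-propagation).
# Training data: a limited set of samples from a smooth I/O association.
X = np.linspace(-1.0, 1.0, 8).reshape(-1, 1)   # 8 training inputs
Y = np.sin(np.pi * X)                          # target outputs (assumed task)

W1 = rng.normal(0.0, 0.5, (1, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)
lr, sigma = 0.1, 0.05                          # sigma: input-noise std (assumed)

for epoch in range(2000):
    # Noise injection: perturb the inputs with fresh zero-mean Gaussian noise.
    Xn = X + rng.normal(0.0, sigma, X.shape)
    H = np.tanh(Xn @ W1 + b1)                  # hidden activations
    P = H @ W2 + b2                            # network outputs
    E = P - Y                                  # output error
    # Back-propagate gradients of the mean squared error.
    gW2 = H.T @ E / len(X); gb2 = E.mean(0)
    dH = (E @ W2.T) * (1.0 - H**2)             # tanh derivative
    gW1 = Xn.T @ dH / len(X); gb1 = dH.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# Evaluate the learned mapping on the clean (noise-free) training inputs.
mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - Y) ** 2))
print(mse)
```

The paper's point is that, in expectation, training on such perturbed inputs penalizes the sensitivity of the network output to small input changes, which smooths the learned mapping between the sample points and thereby improves generalization.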
Pages: 436-440
Page count: 5
References
8 items
  • [1] ERUSU E, 1988, ORG NEURAL NETWORKS, P369
  • [2] HANSON S., 1989, ADV NEURAL INFORM PR, P177
  • [3] ISHIKAWA M, 1988, IEICE MBE88144 TECH, P143
  • [4] KOHONEN T, 1987, SELF ORG ASS MEMORY
  • [5] KUNG SY, 1988, IEEE INT C NEURAL NE, V1, P363
  • [6] PLAUT DC, 1986, CMUCS86126 TECH REP
  • [7] Rumelhart David E., 1987, LEARNING INTERNAL RE, P318
  • [8] Sietsma J., 1988, IEEE International Conference on Neural Networks (IEEE Cat. No.88CH2632-8), P325, DOI 10.1109/ICNN.1988.23864