On the overtraining phenomenon of backpropagation neural networks

Cited by: 25
Authors:
Tzafestas, SG
Dalianis, PJ
Anthopoulos, G
Affiliation:
[1] Intelligent Robotics and Contr. Unit, Computer Science Division, Natl. Technical University of Athens, Zografou 15773, Athens
Keywords:
Algorithms; Approximation theory; Backpropagation; Character recognition; Computer simulation; Electric network topology; Learning systems; Performance
DOI:
10.1016/0378-4754(95)00003-8
CLC classification number:
TP39 [Computer applications]
Subject classification codes:
081203; 0835
Abstract:
The study of neural network capabilities is an important subject for the consolidation of the field. In this paper, the relationships between network size, training-set size and generalization capability are examined. The phenomenon of overtraining in backpropagation networks is discussed, and an extension to an existing algorithm is described. The extended algorithm provides a new energy function; its advantages, such as improved plasticity and performance, along with its dynamic properties, are explained. The algorithm is applied to some common problems (XOR, numeric character recognition and function approximation), and simulation results are presented and discussed.
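The overtraining phenomenon described in the abstract is easy to reproduce. Below is a minimal sketch, not the paper's extended algorithm: an oversized backpropagation network (one tanh hidden layer; the layer width, learning rate, epoch count and noisy-sine data are all illustrative assumptions) is trained by plain gradient descent on a small noisy sample of a target function, while the error on clean held-out data is tracked. The training error falls steadily; the held-out error typically stops improving, and may rise, once the network starts fitting the noise.

```python
import numpy as np

rng = np.random.default_rng(0)

# Small, noisy training sample of sin(x) -- the function-approximation
# setting mentioned in the abstract (data choice is illustrative).
X = rng.uniform(-np.pi, np.pi, size=(40, 1))
y = np.sin(X) + 0.3 * rng.normal(size=X.shape)

# Clean held-out data to measure generalization.
Xv = rng.uniform(-np.pi, np.pi, size=(200, 1))
yv = np.sin(Xv)

# Deliberately oversized single-hidden-layer network with tanh units.
H = 50
W1 = rng.normal(0, 0.5, (1, H)); b1 = np.zeros(H)
W2 = rng.normal(0, 0.5, (H, 1)); b2 = np.zeros(1)

def forward(A):
    h = np.tanh(A @ W1 + b1)
    return h, h @ W2 + b2

def mse(pred, target):
    return float(np.mean((pred - target) ** 2))

lr = 0.05
history = []  # (training MSE, held-out MSE) per epoch
for epoch in range(2000):
    h, out = forward(X)
    err = out - y                        # dE/d(out) for mean-squared error
    gW2 = h.T @ err / len(X); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h ** 2)     # backpropagate through tanh
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1       # plain gradient-descent update
    W2 -= lr * gW2; b2 -= lr * gb2
    history.append((mse(out, y), mse(forward(Xv)[1], yv)))

print(f"train MSE:    {history[0][0]:.3f} -> {history[-1][0]:.3f}")
print(f"held-out MSE: {history[0][1]:.3f} -> {history[-1][1]:.3f}")
```

Plotting the two columns of `history` against the epoch index makes the divergence between training and generalization error visible, which is the usual way the overtraining point is located in practice.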
Pages: 507-521
Page count: 15
References (11 items):
[1] BAUM EB, 1990, LECT NOTES COMPUTER, V412
[2] CHAUVIN Y, 1989, ADV NEURAL INFORMATI
[3] CHAUVIN Y, 1990, LECT NOTES COMPUTER, V412
[4] Hecht-Nielsen R., 1989, IJCNN: International Joint Conference on Neural Networks (Cat. No.89CH2765-6), P593, DOI 10.1109/IJCNN.1989.118638
[5] Hecht-Nielsen R., 1990, NEUROCOMPUTING
[6] MCINERNEY JM, 1988, CAN BACKPROPAGATION
[7] RUMELHART DE, 1989, PARALLEL DISTRIBUTED, V1
[8] WATANABE K, TZAFESTAS SG, 1990, Learning algorithms for neural networks with the Kalman filters, JOURNAL OF INTELLIGENT & ROBOTIC SYSTEMS, V3, N4, P305-319
[9] WATANABE K, 1991, INT S MICRO, V9, P205
[10] WATANABE K, 1992, MATH ANAL DESIGN PRO, P525