A quantitative study of fault tolerance, noise immunity, and generalization ability of MLPs

Cited by: 49
Authors
Bernier, JL [1]
Ortega, J [1]
Ros, E [1]
Rojas, I [1]
Prieto, A [1]
Affiliations
[1] Univ Granada, Dept Arquitectura & Tecnol Computadores, E-18071 Granada, Spain
DOI
10.1162/089976600300014782
CLC classification number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104; 0812; 0835; 1405
Abstract
This article analyzes the influence of weight and input perturbations in a multilayer perceptron (MLP) and provides quantitative measurements of fault tolerance, noise immunity, and generalization ability. The expressions obtained make it possible to justify several previously reported conjectures and experimental results (e.g., the influence of weight magnitudes, the relation between training with noise and generalization ability, and the relation between fault tolerance and generalization ability). The measurements introduced here are explicitly related to the mean-squared-error degradation in the presence of perturbations and thus constitute a selection criterion among alternative weight configurations. Moreover, they allow the degradation of an MLP's learning performance to be predicted when its weights or inputs deviate from their nominal values, so the behavior of a physical implementation can be evaluated, according to its accuracy, before the weights are mapped onto it.
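The kind of quantitative measure the abstract describes can be illustrated empirically: perturb an MLP's weights with zero-mean Gaussian noise and track how the mean squared error of its outputs degrades as the noise magnitude grows. The sketch below is a hypothetical illustration, not the paper's analytical formulation; the toy network, data, and noise model are all assumptions chosen for clarity.

```python
import numpy as np

# Hypothetical sketch (not the paper's exact expressions): empirically
# estimate MSE degradation of a small MLP under Gaussian weight noise.
rng = np.random.default_rng(0)

def mlp(x, W1, b1, W2, b2):
    h = np.tanh(x @ W1 + b1)   # hidden layer, tanh activation
    return h @ W2 + b2         # linear output layer

# Toy inputs and nominal weights (assumed, for illustration only).
X = rng.normal(size=(200, 4))
W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)
y = mlp(X, W1, b1, W2, b2)     # targets = nominal outputs, so nominal MSE is 0

def mse_under_weight_noise(sigma, trials=100):
    """Mean MSE over trials when each weight gets N(0, sigma^2) noise added."""
    errs = []
    for _ in range(trials):
        W1p = W1 + rng.normal(scale=sigma, size=W1.shape)
        W2p = W2 + rng.normal(scale=sigma, size=W2.shape)
        yp = mlp(X, W1p, b1, W2p, b2)
        errs.append(np.mean((yp - y) ** 2))
    return float(np.mean(errs))

# Degradation grows with the perturbation magnitude sigma.
for sigma in (0.01, 0.05, 0.1):
    print(sigma, mse_under_weight_noise(sigma))
```

Comparing such degradation curves for two weight configurations with equal nominal MSE gives exactly the sort of selection criterion between alternatives that the abstract mentions.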
Pages: 2941-2964 (24 pages)