Improving the tolerance of multilayer perceptrons by minimizing the statistical sensitivity to weight deviations

Cited by: 30
Authors
Bernier, JL [1]
Ortega, J [1]
Rojas, I [1]
Prieto, A [1]
Affiliations
[1] Univ Granada, Dept Arquitectura & Tecnol Computadores, Granada, Spain
Keywords
fault tolerance; backpropagation; statistical sensitivity; weight deviations; ANOVA;
DOI
10.1016/S0925-2312(99)00150-2
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
This paper proposes a version of the backpropagation algorithm that increases the tolerance of a feedforward neural network to deviations in its weight values. Such deviations can arise either when the network is mapped onto a VLSI circuit with low precision and/or poor weight matching, or from physical defects affecting the neural circuits. The modified backpropagation algorithm we propose uses the statistical sensitivity of the network to weight changes as a quantitative measure of network tolerance and attempts to reduce this sensitivity while keeping the usual training performance figures (error and time) similar to those obtained with standard backpropagation. (C) 2000 Elsevier Science B.V. All rights reserved.
Pages: 87-103
Page count: 17
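
To illustrate the general idea described in the abstract, the following minimal sketch (Python/NumPy) trains a one-hidden-layer MLP on a combined objective MSE + lam * S, where S is a first-order proxy for the statistical sensitivity: the average, over training patterns, of the sum of squared partial derivatives of the output with respect to each weight. The network size, the toy regression target, the penalty weight lam, and the finite-difference gradient are assumptions made for brevity; they are not taken from the paper, which derives the exact sensitivity expressions and backpropagates them analytically.

import numpy as np

# Rough sketch, not the authors' algorithm: penalize a first-order
# proxy of the statistical sensitivity, sum_w (dy/dw)^2, alongside the MSE.

rng = np.random.default_rng(0)

def unpack(theta, n_in, n_hid):
    """Split the flat parameter vector into W, b, V, c."""
    i = 0
    W = theta[i:i + n_hid * n_in].reshape(n_hid, n_in); i += n_hid * n_in
    b = theta[i:i + n_hid]; i += n_hid
    V = theta[i:i + n_hid]; i += n_hid
    c = theta[i]
    return W, b, V, c

def combined_loss(theta, X, t, lam, n_in, n_hid):
    """Mean squared error plus lam times the sensitivity proxy."""
    W, b, V, c = unpack(theta, n_in, n_hid)
    H = np.tanh(X @ W.T + b)            # hidden activations, shape (N, n_hid)
    y = H @ V + c                       # linear output, shape (N,)
    mse = np.mean((y - t) ** 2)
    dh = (1.0 - H ** 2) * V             # dy/d(pre-activation_j), shape (N, n_hid)
    # Sensitivity proxy per pattern: (dy/dV)^2 + (dy/dc)^2 + (dy/dW)^2 + (dy/db)^2.
    sens = (np.sum(H ** 2, axis=1) + 1.0
            + np.sum((dh[:, :, None] * X[:, None, :]) ** 2, axis=(1, 2))
            + np.sum(dh ** 2, axis=1))
    return mse + lam * np.mean(sens)

def num_grad(f, theta, eps=1e-5):
    """Central finite-difference gradient (adequate for this tiny model)."""
    g = np.zeros_like(theta)
    for k in range(theta.size):
        d = np.zeros_like(theta); d[k] = eps
        g[k] = (f(theta + d) - f(theta - d)) / (2 * eps)
    return g

# Toy regression problem (assumed for illustration): t = sin(x1) + 0.5*x2.
n_in, n_hid = 2, 4
X = rng.uniform(-1, 1, size=(64, n_in))
t = np.sin(X[:, 0]) + 0.5 * X[:, 1]

theta = 0.1 * rng.standard_normal(n_in * n_hid + n_hid + n_hid + 1)
lam, lr = 1e-3, 0.05
for step in range(500):
    f = lambda th: combined_loss(th, X, t, lam, n_in, n_hid)
    theta -= lr * num_grad(f, theta)

print("final combined loss:", combined_loss(theta, X, t, lam, n_in, n_hid))

Setting lam = 0 recovers plain gradient-descent training on the same toy problem, which makes it easy to compare how much each trained network degrades when random perturbations are added to the weights.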