TOLERANCE TO ANALOG HARDWARE OF ON-CHIP LEARNING IN BACKPROPAGATION NETWORKS

Cited: 47
Authors
DOLENKO, BK [1 ]
CARD, HC [1 ]
Affiliations
[1] UNIV MANITOBA,WINNIPEG,MB R3T 2N2,CANADA
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 1995 / Vol. 6 / No. 5
Funding
Natural Sciences and Engineering Research Council of Canada;
DOI
10.1109/72.410349
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
When training an artificial neural network using the popular backpropagation algorithm, implementation in dedicated analog hardware offers an attractive alternative for reasons of speed, compactness, and the absence of the limited computational resolution present in digital hardware. Despite these advantages, until now the use of analog hardware in backpropagation networks has been primarily limited to the forward computation. The learning computations are usually performed off-chip, because it is not immediately evident that the nonidealities inherent in analog hardware allow adequate learning to take place. In this paper we present results of simulations performed assuming both forward and backward computations are done on-chip using analog components. The aspects of analog hardware studied are component variability (variability in multiplier gains and zero offsets), limited voltage ranges, components (multipliers) that only approximate the computations in the backpropagation algorithm, and capacitive weight decay. It is shown that backpropagation networks can learn to compensate for all these shortcomings of analog circuits except zero offsets, and the latter are correctable with minor circuit complications. Variability in multiplier gains is not a problem, and learning is still possible despite limited voltage ranges and function approximations. Fixed component variation from fabrication is shown to be less detrimental to learning than component variation due to noise. Weight decay is tolerable provided it is sufficiently small, which implies frequent refreshing by rehearsal on the training data or modest cooling of the circuits. The former approach allows for learning nonstationary problem sets.
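The nonidealities the abstract lists can be mimicked in a toy simulation. The sketch below is illustrative only: the single-unit AND task, the parameter values, and the decay model are assumptions of this sketch, not taken from the paper. It trains one sigmoid unit by gradient descent while every multiply passes through a fixed nonideal gain, the summing node is range-limited (the "limited voltage range"), and every weight decays slightly after each update (the "capacitive weight decay"):

```python
import math
import random

random.seed(0)

# Illustrative nonideality parameters (assumed values, not from the paper).
GAIN_SPREAD = 0.2   # fixed fabrication-time multiplier gain error: gain in [0.8, 1.2]
V_LIMIT = 3.0       # limited voltage range: net input clipped to +/- V_LIMIT
DECAY = 1e-4        # capacitive weight decay applied after every update

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def clip(v):
    return max(-V_LIMIT, min(V_LIMIT, v))

# A single sigmoid unit learning AND, with every multiply routed through a
# nonideal gain, the summing node range-limited, and weights decaying.
x_data = [[0, 0], [0, 1], [1, 0], [1, 1]]
t_data = [0, 0, 0, 1]

w = [random.uniform(-1, 1) for _ in range(3)]                       # two weights + bias
g = [1 + random.uniform(-GAIN_SPREAD, GAIN_SPREAD) for _ in range(3)]  # fixed gains

for _ in range(2000):
    for x, t in zip(x_data, t_data):
        xb = x + [1.0]
        net = clip(sum(g[i] * w[i] * xb[i] for i in range(3)))  # nonideal forward pass
        y = sigmoid(net)
        for i in range(3):
            w[i] += 0.5 * (t - y) * xb[i]   # gradient step (cross-entropy loss)
            w[i] -= DECAY * w[i]            # capacitive weight decay

outputs = [sigmoid(clip(sum(g[i] * w[i] * (x + [1.0])[i] for i in range(3))))
           for x in x_data]
print([round(y) for y in outputs])  # outputs round to the AND truth table
```

Consistent with the paper's finding, the fixed gain variation merely rescales each input and does not prevent learning, while the clipped summing node only caps the unit's confidence (outputs saturate near sigmoid(±V_LIMIT)) rather than flipping its decisions.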
Pages: 1045-1052
Page count: 8
References
21 items in total
[1] ARIMA Y, 1992, IEEE J SOLID STATE C, V26, P607
[2] CHOI J, 1992, 1992 P INT JOINT C N, V2, P637
[3] DOLENKO BK, 1993, 1993 P IEEE INT C NE, V1, P110
[4] FAHLMAN SE, 1988, CMUCS88162 CARN MELL
[5] FRYE RC, RIETMAN EA, WONG CC. Back-propagation learning and nonidealities in analog neural network hardware [J]. IEEE Transactions on Neural Networks, 1991, 2(01): 110-117
[6] GILBERT B. High-performance monolithic multiplier using active feedback [J]. IEEE Journal of Solid-State Circuits, 1974, SC-9(06): 364-373
[7] HOCHET B, PEIRIS V, ABDO S, DECLERCQ MJ. Implementation of a learning Kohonen neuron based on a new multilevel storage technique [J]. IEEE Journal of Solid-State Circuits, 1991, 26(03): 262-267
[8] HOLLIS PW, HARPER JS, PAULOS JJ. The effects of precision constraints in a backpropagation learning network [J]. Neural Computation, 1990, 2(03): 363-373
[9] HOLT JL, 1991, 1991 P INT JOINT C N, P1519
[10] HOLT JL, 1991, 1991 P INT JOINT C N