An analog VLSI recurrent neural network learning a continuous-time trajectory

Cited by: 88
Authors
Cauwenberghs, G [1]
Affiliation
[1] Caltech, Dept. of Electrical Engineering, Pasadena, CA 91125
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 1996, Vol. 7, No. 2
DOI
10.1109/72.485671
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Real-time algorithms for gradient-descent supervised learning in recurrent dynamical neural networks do not support scalable VLSI (very large scale integration) implementation, because their complexity grows sharply with network dimension. We present an alternative analog VLSI implementation, which employs a stochastic perturbative algorithm to observe the gradient of the error index directly on the network in random directions of the parameter space, thereby avoiding the tedious task of deriving the gradient from an explicit model of the network dynamics. The network contains six fully recurrent neurons with continuous-time dynamics, providing 42 free parameters comprising connection strengths and thresholds. The chip includes local provisions supporting both learning and storage of the parameters, integrated in a scalable architecture that can readily be expanded for applications requiring recurrent dynamical networks of larger dimensionality. We describe and characterize the functional elements of the implemented recurrent network and integrated learning system, and include experimental results from training the network to produce two outputs following a circular trajectory, representing a quadrature-phase oscillator.
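The stochastic perturbative scheme the abstract describes — perturbing all parameters simultaneously along a random direction and measuring the resulting change in the error index, rather than deriving gradients from an explicit model of the network dynamics — can be sketched in software roughly as follows. This is a minimal SPSA-style sketch; the function names, step sizes, and the toy quadratic error index are illustrative assumptions, not the chip's actual circuitry or trajectory-learning task.

```python
import numpy as np

def perturbative_step(theta, error_fn, sigma=0.1, lr=0.01, rng=None):
    """One stochastic perturbative update (illustrative SPSA-style sketch).

    The error index is observed for a +/- perturbation of all parameters
    at once along a random sign vector; the finite difference along that
    direction yields a descent direction without any explicit model of
    the network dynamics.
    """
    rng = rng if rng is not None else np.random.default_rng()
    pi = rng.choice([-1.0, 1.0], size=theta.shape)   # random perturbation direction
    e_plus = error_fn(theta + sigma * pi)            # observed error, +perturbation
    e_minus = error_fn(theta - sigma * pi)           # observed error, -perturbation
    g_hat = (e_plus - e_minus) / (2.0 * sigma) * pi  # gradient estimate along pi
    return theta - lr * g_hat

# Toy usage: drive a quadratic error index toward zero over 42 parameters
# (matching the paper's 42 connection strengths and thresholds).
rng = np.random.default_rng(0)
theta = np.ones(42)
err = lambda p: float(np.sum(p ** 2))
for _ in range(300):
    theta = perturbative_step(theta, err, rng=rng)
```

Because the chip only needs to evaluate the error index at perturbed parameter settings, this style of learning maps naturally onto analog hardware, where computing exact gradients of the continuous-time dynamics would be impractical.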
Pages: 346-361 (16 pages)
References (47 items)
[1] Alspector J, Gannett JW, Haber S, Parker MB, Chu R. A VLSI-efficient technique for generating multiple uncorrelated noise sources and its application to stochastic neural networks. IEEE Transactions on Circuits and Systems, 1991, 38(1): 109-123.
[2] Alspector J. Advances in Neural Information Processing Systems, 1989, 1: 748.
[3] Andreou AG, Boahen KA, Pouliquen PO, Pavasovic A, Jenkins RE, Strohbehn K. Current-mode subthreshold MOS circuits for analog VLSI neural systems. IEEE Transactions on Neural Networks, 1991, 2(2): 205-213.
[4] [Anonymous]. Advances in Neural Information Processing Systems, 1993.
[5] [Anonymous]. Advances in Neural Information Processing Systems.
[6] Baldi P. Gradient descent learning algorithm overview: a general dynamical-systems perspective. IEEE Transactions on Neural Networks, 1995, 6(1): 182-195.
[7] Baldi P, 1992, LEARNING DYNAMICAL S.
[8] Benson RG, Kerns DA. UV-activated conductances allow for multiple time-scale learning. IEEE Transactions on Neural Networks, 1993, 4(3): 434-440.
[9] Bibyk S, 1989, ANALOG VLSI IMPLEMEN, P103.
[10] Cauwenberghs G, Neugebauer CF, Yariv A. Analysis and verification of an analog VLSI incremental outer-product learning system. IEEE Transactions on Neural Networks, 1992, 3(3): 488-497.