Memory in linear recurrent neural networks in continuous time

Cited by: 44
Authors
Hermans, Michiel [1 ]
Schrauwen, Benjamin [1 ]
Affiliations
[1] Univ Ghent, Dept Elect & Informat Syst, B-9000 Ghent, Belgium
Keywords
Reservoir computing; Continuous time; Memory function; Linear dynamics; Recurrent neural networks; COMPUTATION; CHAOS; EDGE;
DOI
10.1016/j.neunet.2009.08.008
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
Reservoir Computing is a technique that employs recurrent neural networks while circumventing difficult training algorithms. A recent trend in Reservoir Computing is the use of real physical dynamical systems as implementation platforms, rather than the customary digital emulations. Physical systems operate in continuous time, creating a fundamental difference with the classic discrete-time definitions of Reservoir Computing. The specific goal of this paper is to study the memory properties of such systems, where we limit ourselves to linear dynamics. We develop an analytical model which allows the calculation of the memory function for continuous-time linear dynamical systems, which can be considered as networks of linear leaky integrator neurons. We then use this model to investigate the memory properties of different reservoir types. We start with random connection matrices with a shifted eigenvalue spectrum, which perform very poorly. Next, we transform two specific reservoir types, which are known to give good performance in discrete time, to the continuous-time domain. Reservoirs based on uniform spreading of connection matrix eigenvalues on the unit disk in discrete time give much better memory properties than reservoirs with random connection matrices, while reservoirs based on orthogonal connection matrices in discrete time are very robust against noise and have tunable memory properties. The overall results found in this work yield important insights into how to design networks for continuous time. (C) 2009 Elsevier Ltd. All rights reserved.
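The abstract describes continuous-time linear reservoirs (networks of linear leaky integrator neurons) and their memory function. The following is a minimal sketch, not the authors' analytical model: it simulates dx/dt = -a x + W x + w_in u(t) by Euler integration and estimates the memory function m(tau) empirically, as the squared correlation between the best linear readout of the state and the input delayed by tau steps. All parameter values (reservoir size, leak rate, spectral radius, step size) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100      # reservoir size (illustrative)
dt = 0.1     # Euler integration step
leak = 1.0   # leak rate a of the leaky integrator neurons
rho = 0.9    # spectral radius imposed on the connection matrix

# Random connection matrix rescaled to the chosen spectral radius,
# plus a random input vector.
W = rng.standard_normal((N, N)) / np.sqrt(N)
W *= rho / max(abs(np.linalg.eigvals(W)))
w_in = rng.standard_normal(N)

# Drive the reservoir with white-noise input and record the states.
T = 5000
u = rng.standard_normal(T)
X = np.zeros((T, N))
x = np.zeros(N)
for t in range(T):
    # Euler step of dx/dt = -leak*x + W x + w_in u(t)
    x = x + dt * (-leak * x + W @ x + w_in * u[t])
    X[t] = x

def memory_function(delay, washout=200):
    """Squared correlation of the optimal linear readout with u(t - delay)."""
    Xd = X[washout:]
    target = u[washout - delay : T - delay]
    w, *_ = np.linalg.lstsq(Xd, target, rcond=None)
    y = Xd @ w
    c = np.corrcoef(y, target)[0, 1]
    return c ** 2

# Memory function sampled at a few delays (in units of dt).
m = [memory_function(d) for d in range(0, 30, 5)]
```

Each value lies in [0, 1], and recall of recent inputs is better than recall of older ones; summing m(tau) over all delays gives the empirical memory capacity studied in this line of work.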
Pages: 341-355
Page count: 15