Memory in linear recurrent neural networks in continuous time

Cited by: 43
Authors
Hermans, Michiel [1 ]
Schrauwen, Benjamin [1 ]
Affiliations
[1] Univ Ghent, Dept Elect & Informat Syst, B-9000 Ghent, Belgium
Keywords
Reservoir computing; Continuous time; Memory function; Linear dynamics; Recurrent neural networks; Computation; Chaos; Edge
DOI
10.1016/j.neunet.2009.08.008
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Reservoir Computing is a novel technique that employs recurrent neural networks while circumventing difficult training algorithms. A very recent trend in Reservoir Computing is the use of real physical dynamical systems as implementation platforms, rather than the customary digital emulations. Physical systems operate in continuous time, which creates a fundamental difference from the classic discrete-time definitions of Reservoir Computing. The specific goal of this paper is to study the memory properties of such systems, where we limit ourselves to linear dynamics. We develop an analytical model that allows the memory function of continuous-time linear dynamical systems, which can be viewed as networks of linear leaky-integrator neurons, to be calculated. We then use this model to investigate the memory properties of different types of reservoirs. We start with random connection matrices with a shifted eigenvalue spectrum, which perform very poorly. Next, we transform two specific reservoir types, known to give good performance in discrete time, to the continuous-time domain. Reservoirs whose connection-matrix eigenvalues are spread uniformly over the unit disk in discrete time have much better memory properties than reservoirs with random connection matrices, while reservoirs based on orthogonal connection matrices in discrete time are very robust against noise and have memory properties that can be tuned. The overall results of this work yield important insights into how to design networks for continuous time. (C) 2009 Elsevier Ltd. All rights reserved.
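The abstract refers to the memory function of a continuous-time linear reservoir, i.e. how well a linear readout of the instantaneous state can reconstruct the input a delay tau in the past. The paper derives this analytically; the sketch below only illustrates the quantities involved by simulating a linear leaky-integrator reservoir with a scaled orthogonal connection matrix (forward Euler) and estimating the memory function empirically with a ridge readout. The network size, leak rate, timestep, spectral scaling, and the use of NumPy are illustrative assumptions, not values or methods taken from the paper.

```python
# Minimal sketch (not the authors' analytical model): estimate the memory
# function of a continuous-time linear leaky-integrator reservoir empirically.
import numpy as np

rng = np.random.default_rng(0)

N = 100       # number of leaky-integrator neurons (illustrative)
dt = 0.1      # Euler integration step
T = 20000     # number of simulation steps
leak = 1.0    # leak rate of the integrators

# Connection matrix: random orthogonal, scaled so that the continuous-time
# dynamics dx/dt = -leak*x + W x + v u(t) are stable (all eigenvalues of
# -leak*I + W have negative real part).
Q, _ = np.linalg.qr(rng.standard_normal((N, N)))
W = 0.9 * leak * Q
v = rng.standard_normal(N)        # input weights

u = rng.standard_normal(T)        # white-noise input signal
X = np.zeros((T, N))
x = np.zeros(N)
for t in range(1, T):
    # forward-Euler step of the linear leaky-integrator dynamics
    x = x + dt * (-leak * x + W @ x + v * u[t])
    X[t] = x

def memory_curve(X, u, max_delay=200, ridge=1e-6):
    """m[k]: squared correlation between u(t - k*dt) and its best linear
    (ridge-regularized) reconstruction from the state x(t)."""
    washout = 500                 # discard initial transient
    xs = X[washout:]
    XtX = xs.T @ xs + ridge * np.eye(X.shape[1])
    m = np.zeros(max_delay)
    for k in range(max_delay):
        ys = u[washout - k:len(u) - k]        # input delayed by k steps
        w = np.linalg.solve(XtX, xs.T @ ys)   # ridge readout weights
        m[k] = np.corrcoef(xs @ w, ys)[0, 1] ** 2
    return m

m = memory_curve(X, u)
# Delay index k corresponds to a lag of k*dt in continuous time.
print("memory capacity (sum of m over sampled delays):", m.sum())
```

Plotting m against k*dt gives the empirical memory curve; in the paper the corresponding quantity is obtained analytically from the linear dynamics rather than from simulation.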
Pages: 341-355 (15 pages)