Memory traces in dynamical systems

Cited by: 201
Authors
Ganguli, Surya [1 ,2 ]
Huh, Dongsung [3 ]
Sompolinsky, Haim [4 ,5 ]
Affiliations
[1] Univ Calif San Francisco, Sloan Swartz Ctr Theoret Neurobiol, San Francisco, CA 94143 USA
[2] Columbia Univ, Ctr Theoret Neurosci, New York, NY 10032 USA
[3] Univ Calif San Diego, Computat Neurobiol Program, La Jolla, CA 92093 USA
[4] Hebrew Univ Jerusalem, Interdisciplinary Ctr Neural Computat, IL-91904 Jerusalem, Israel
[5] Harvard Univ, Ctr Brain Sci, Cambridge, MA 02138 USA
Keywords
Fisher information; fluid mechanics; network dynamics
DOI
10.1073/pnas.0804451105
Chinese Library Classification (CLC)
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences]
Discipline codes
07; 0710; 09
Abstract
To perform nontrivial, real-time computations on a sensory input stream, biological systems must retain a short-term memory trace of their recent inputs. It has been proposed that generic high-dimensional dynamical systems could retain a memory trace for past inputs in their current state. This raises important questions about the fundamental limits of such memory traces and the properties required of dynamical systems to achieve these limits. We address these issues by applying Fisher information theory to dynamical systems driven by time-dependent signals corrupted by noise. We introduce the Fisher Memory Curve (FMC) as a measure of the signal-to-noise ratio (SNR) embedded in the dynamical state relative to the input SNR. The integrated FMC indicates the total memory capacity. We apply this theory to linear neuronal networks and show that the capacity of networks with normal connectivity matrices is exactly 1 and that of any network of N neurons is, at most, N. A nonnormal network achieving this bound is subject to stringent design constraints: it must have a hidden feedforward architecture that superlinearly amplifies its input for a time of order N, and the input connectivity must optimally match this architecture. The memory capacity of networks subject to saturating nonlinearities is further limited and cannot exceed √N. This limit can be realized by feedforward structures with divergent fan-out that distributes the signal across neurons, thereby avoiding saturation. We illustrate the generality of the theory by showing that memory in fluid systems can be sustained by transient nonnormal amplification due to convective instability or the onset of turbulence.
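A minimal numerical sketch (not the authors' code) of the capacity results summarized above: it assumes the standard discrete-time linear network x(t) = W x(t-1) + v s(t) + z(t) with unit-variance Gaussian state noise, and takes the FMC to be J(k) = v^T (W^k)^T C^{-1} W^k v, where C is the stationary noise covariance. The function name, weight values, and network size below are illustrative assumptions.

import numpy as np

def fisher_memory_curve(W, v, horizon=200):
    """J(k) for k = 0..horizon-1, assuming unit-variance state noise."""
    N = W.shape[0]
    # Stationary noise covariance C = sum_j W^j (W^j)^T (truncated series;
    # assumes it converges, i.e. spectral radius < 1 or nilpotent W).
    C = np.zeros((N, N))
    Wj = np.eye(N)
    for _ in range(10 * horizon):
        C = C + Wj @ Wj.T
        Wj = W @ Wj
    Cinv = np.linalg.pinv(C)
    J = np.empty(horizon)
    Wk_v = v.copy()                      # holds W^k v
    for k in range(horizon):
        J[k] = Wk_v @ Cinv @ Wk_v
        Wk_v = W @ Wk_v
    return J

N = 20
rng = np.random.default_rng(0)

# Normal connectivity (symmetric random matrix, spectral radius 0.9):
# the summed FMC should come out close to 1, matching the abstract's result.
A = rng.standard_normal((N, N))
S = A + A.T
W_normal = 0.9 * S / np.abs(np.linalg.eigvalsh(S)).max()
v = rng.standard_normal(N)
v /= np.linalg.norm(v)
print("normal capacity:", fisher_memory_curve(W_normal, v).sum())

# Nonnormal, hidden feedforward chain with amplifying weights (w > 1),
# driven at its head: its capacity exceeds 1 and grows toward the bound N
# as w increases.
w = 2.0
W_chain = np.diag(w * np.ones(N - 1), k=-1)
e1 = np.zeros(N)
e1[0] = 1.0
print("chain  capacity:", fisher_memory_curve(W_chain, e1).sum())

The chain is nilpotent (W^N = 0), so its weights may exceed 1, giving the transient superlinear amplification described in the abstract without any long-term instability.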
Pages: 18970-18975
Number of pages: 6