Teaching recurrent neural networks to infer global temporal structure from local examples

Cited by: 83
Authors
Kim, Jason Z. [1 ]
Lu, Zhixin [1 ]
Nozari, Erfan [2 ,3 ]
Pappas, George J. [2 ]
Bassett, Danielle S. [1 ,2 ,4 ,5 ,6 ,7 ]
Affiliations
[1] Univ Penn, Dept Bioengn, Philadelphia, PA 19104 USA
[2] Univ Penn, Dept Elect & Syst Engn, Philadelphia, PA 19104 USA
[3] Univ Calif Riverside, Dept Mech Engn, Riverside, CA 92521 USA
[4] Univ Penn, Dept Phys & Astron, Philadelphia, PA 19104 USA
[5] Univ Penn, Dept Neurol, Philadelphia, PA 19104 USA
[6] Univ Penn, Dept Psychiat, Philadelphia, PA 19104 USA
[7] Santa Fe Inst, Santa Fe, NM 87501 USA
Funding
US National Science Foundation
Keywords
GENDERED CITATION PATTERNS; NEURONS; MODEL
DOI
10.1038/s42256-021-00321-2
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
The ability to store and manipulate information is a hallmark of computational systems. Whereas computers are carefully engineered to represent and perform mathematical operations on structured data, neurobiological systems adapt to perform analogous functions without needing to be explicitly engineered. Recent efforts have made progress in modelling the representation and recall of information in neural systems. However, precisely how neural systems learn to modify these representations remains far from understood. Here, we demonstrate that a recurrent neural network (RNN) can learn to modify its representation of complex information using only examples, and we explain the associated learning mechanism with new theory. Specifically, we drive an RNN with examples of translated, linearly transformed or pre-bifurcated time series from a chaotic Lorenz system, alongside an additional control signal that changes value for each example. By training the network to replicate the Lorenz inputs, it learns to autonomously evolve about a Lorenz-shaped manifold. Additionally, it learns to continuously interpolate and extrapolate the translation, transformation and bifurcation of this representation far beyond the training data by changing the control signal. Furthermore, we demonstrate that RNNs can infer the bifurcation structure of normal forms and period doubling routes to chaos, and extrapolate non-dynamical, kinematic trajectories. Finally, we provide a mechanism for how these computations are learned, and replicate our main results using a Wilson-Cowan reservoir. Together, our results provide a simple but powerful mechanism by which an RNN can learn to manipulate internal representations of complex information, enabling the principled study and precise design of RNNs.
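The closed-loop training scheme the abstract describes maps naturally onto a standard echo-state network. Below is a minimal sketch of that idea, assuming a conventional reservoir-computing setup: a fixed random reservoir is driven by translated Lorenz trajectories together with a constant control signal c, a linear readout is fit by ridge regression to reproduce the input, and the loop is then closed with a control value outside the training range. All hyperparameters (reservoir size, spectral radius, input scales, ridge penalty) and the specific translation task are illustrative choices, not the values used in the paper.

```python
# Minimal echo-state-network sketch of the scheme described in the abstract:
# drive a random reservoir with translated Lorenz trajectories plus a constant
# control signal c, train a linear readout to reproduce the input, then run
# the loop closed with an unseen c to extrapolate the translation.
import numpy as np
from scipy.integrate import solve_ivp

def lorenz(t, s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

def trajectory(T=4000, dt=0.01):
    t_eval = np.arange(0, T * dt, dt)
    sol = solve_ivp(lorenz, (0, t_eval[-1]), [1.0, 1.0, 1.0],
                    t_eval=t_eval, rtol=1e-8)
    return sol.y.T  # shape (T, 3)

rng = np.random.default_rng(0)
N = 500                                            # reservoir size (illustrative)
A = rng.uniform(-1, 1, (N, N)) * (rng.random((N, N)) < 0.05)
A *= 0.9 / np.max(np.abs(np.linalg.eigvals(A)))    # spectral radius 0.9
B = 0.1 * rng.uniform(-1, 1, (N, 3))               # input weights
C = 0.1 * rng.uniform(-1, 1, N)                    # control-signal weights

def drive(r, u_seq, c):
    """Open-loop update r <- tanh(A r + B u + C c); return visited states."""
    states = np.empty((len(u_seq), N))
    for t, u in enumerate(u_seq):
        r = np.tanh(A @ r + B @ u + C * c)
        states[t] = r
    return states

# One translated copy of the attractor per control value (shift along x).
base = trajectory()
washout = 200
R, Y = [], []
for c in (-10.0, 0.0, 10.0):
    u = base + np.array([c, 0.0, 0.0])
    states = drive(np.zeros(N), u[:-1], c)
    R.append(states[washout:])                     # reservoir states r_{t+1}
    Y.append(u[1:][washout:])                      # targets: next input u_{t+1}
R, Y = np.vstack(R), np.vstack(Y)

# Ridge-regression readout so that W_out r approximates u.
W_out = np.linalg.solve(R.T @ R + 1e-6 * np.eye(N), R.T @ Y).T

# Closed loop at an unseen control value: feed the readout back as input.
c_new = 20.0                                       # outside the training range
r = R[-1]                                          # warm state from the c = 10 run
pred = np.empty((2000, 3))
for t in range(len(pred)):
    u = W_out @ r
    r = np.tanh(A @ r + B @ u + C * c_new)
    pred[t] = u
print("mean x of extrapolated attractor:", pred[500:, 0].mean())
```

If the network has learned the translation as a continuous parameter, the closed-loop trajectory at c_new = 20 should settle onto a Lorenz-shaped set centred near x = 20, even though training only covered c in {-10, 0, 10}.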
Pages: 316-323
Number of pages: 8