Considerations in using recurrent neural networks to probe neural dynamics

Cited by: 10
Authors
Kao, Jonathan C. [1 ,2 ]
Affiliations
[1] Univ Calif Los Angeles, Dept Elect & Comp Engn, Los Angeles, CA USA
[2] Univ Calif Los Angeles, Neurosci Program, Los Angeles, CA USA
Keywords
artificial neural network; motor cortex; neural computation; neural dynamics; recurrent neural network; MOTOR CORTEX; PREMOTOR CORTEX; MOVEMENT; COMPLEXITY; NEURONS;
DOI
10.1152/jn.00467.2018
Chinese Library Classification (CLC)
Q189 [Neuroscience];
Discipline classification code
071006;
Abstract
Recurrent neural networks (RNNs) are increasingly being used to model complex cognitive and motor tasks performed by behaving animals. RNNs are trained to reproduce animal behavior while also capturing key statistics of empirically recorded neural activity. In this manner, the RNN can be viewed as an in silico circuit whose computational elements share similar motifs with the cortical area it is modeling. Furthermore, because the RNN's governing equations and parameters are fully known, they can be analyzed to propose hypotheses for how neural populations compute. In this context, we present important considerations when using RNNs to model motor behavior in a delayed reach task. First, by varying the network's nonlinear activation and rate regularization, we show that RNNs reproducing single-neuron firing rate motifs may not adequately capture important population motifs. Second, we find that even when RNNs reproduce key neurophysiological features on both the single neuron and population levels, they can do so through distinctly different dynamical mechanisms. To distinguish between these mechanisms, we show that an RNN consistent with a previously proposed dynamical mechanism is more robust to input noise. Finally, we show that these dynamics are sufficient for the RNN to generalize to tasks it was not trained on. Together, these results emphasize important considerations when using RNN models to probe neural dynamics.

NEW & NOTEWORTHY
Artificial neurons in a recurrent neural network (RNN) may resemble empirical single-unit activity but not adequately capture important features on the neural population level. Dynamics of RNNs can be visualized in low-dimensional projections to provide insight into the RNN's dynamical mechanism. RNNs trained in different ways may reproduce neurophysiological motifs but do so with distinctly different mechanisms. RNNs trained to only perform a delayed reach task can generalize to perform tasks where the target is switched or the target location is changed.
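The sketch below is a minimal illustration of the kind of continuous-time rate RNN and low-dimensional population projection the abstract refers to. It is not the paper's model: the nonlinearity phi, the L2 penalty on firing rates, the random weights, and all dimensions and constants are illustrative assumptions.

# Minimal sketch (assumptions): a continuous-time rate RNN of the form
#   tau * dx/dt = -x + W_rec @ r + W_in @ u,   r = phi(x),
# simulated with Euler steps. The nonlinearity, the rate-penalty weight, and
# all sizes/constants below are illustrative, not the paper's exact setup.
import numpy as np

rng = np.random.default_rng(0)

N, N_in, T, dt, tau = 200, 2, 300, 0.01, 0.05   # units, inputs, time steps, step size, time constant
phi = np.tanh                                    # swap for a rectified nonlinearity to compare motifs

W_rec = rng.normal(0.0, 1.5 / np.sqrt(N), size=(N, N))      # random recurrent weights
W_in = rng.normal(0.0, 1.0 / np.sqrt(N_in), size=(N, N_in))  # random input weights

u = np.zeros((T, N_in))
u[:100, 0] = 1.0                                 # a simple "target cue" input during a delay epoch

x = np.zeros(N)
rates = np.empty((T, N))
for t in range(T):
    r = phi(x)
    rates[t] = r
    x = x + (dt / tau) * (-x + W_rec @ r + W_in @ u[t])   # Euler update of the state

# L2 firing-rate regularization term of the kind varied in the paper (weight is illustrative).
rate_penalty = 1e-3 * np.mean(rates ** 2)

# Low-dimensional projection of population dynamics (PCA via SVD of mean-centered rates).
centered = rates - rates.mean(axis=0)
_, _, Vt = np.linalg.svd(centered, full_matrices=False)
projection = centered @ Vt[:2].T                 # trajectory in the top-2 PC plane

print(rate_penalty, projection.shape)

Swapping phi between tanh and a rectified nonlinearity, and scaling the rate-penalty weight, corresponds to the activation and rate-regularization manipulations described above; projecting the simulated rates onto their leading principal components is one standard way to visualize the resulting population dynamics.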
Pages: 2504-2521
Page count: 18