Recurrent spiking neural network with dynamic presynaptic currents based on backpropagation

Cited by: 13
Authors
Wang, Zijian [1 ]
Zhang, Yanting [1 ]
Shi, Haibo [2 ]
Cao, Lei [3 ]
Yan, Cairong [1 ]
Xu, Guangwei [1 ]
Affiliations
[1] Donghua Univ, Sch Comp Sci & Technol, Shanghai, Peoples R China
[2] Shanghai Univ Finance & Econ, Sch Stat & Management, Shanghai, Peoples R China
[3] Shanghai Maritime Univ, Dept Elect Engn, Shanghai 200000, Peoples R China
Funding
National Natural Science Foundation of China; Natural Science Foundation of Shanghai;
Keywords
backpropagation; brain-inspired computing; brain-like computing; convolutional neural network; leaky integrate-and-fire neuron; spiking neural network; PLASTICITY; NEURONS;
DOI
10.1002/int.22772
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
In recent years, spiking neural networks (SNNs), which originate from the theoretical foundations of neuroscience, have attracted attention in neuromorphic and brain-like computing because their neural dynamics and coding mechanisms resemble those of biological neurons, and they have become one of the mainstream frameworks in the field of brain-like computing. However, most Leaky Integrate-and-Fire (LIF) neuron models currently used by SNNs trained directly with backpropagation (BP) consider neither recurrent connections nor the change in the strength of neuron connections over time. This study presents a LIF neuron model with recurrent connections together with a method for dynamically changing the presynaptic currents. Compared with classic LIF neurons, recurrent LIF neurons have an additional cyclic connection: their postsynaptic current stimulates the change in membrane potential at the next time step, making their dynamics more similar to the activity of biological neurons. We also propose an efficient and flexible BP training method for recurrent LIF neurons. On the basis of these methods, we propose the recurrent SNN with dynamic presynaptic currents based on backpropagation (RDS-BP). We test RDS-BP on three image data sets (MNIST, Fashion-MNIST, and CIFAR-10) and two text data sets (IMDB and TREC). The results show that RDS-BP outperforms not only naive BP-based SNN models but also recently proposed SNN methods that performed strongly in earlier experiments. Our work provides a new LIF neuron model with recurrent connections and dynamic presynaptic currents, along with a BP training scheme for the proposed neuron, which could benefit developments in neuromorphic and brain-like computing.
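The abstract describes two technical ingredients: a LIF neuron whose presynaptic current changes dynamically over time and is fed back through a recurrent (cyclic) connection, and direct BP training of such neurons. The sketch below is a hypothetical PyTorch rendering of those ideas, not the paper's actual equations: the decay constants tau_syn and tau_mem, the rectangular surrogate gradient, the hard reset, and the names SurrogateSpike and RecurrentLIFCell are all illustrative assumptions.

import math
import torch
import torch.nn as nn

class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass; smooth surrogate in the backward pass."""
    @staticmethod
    def forward(ctx, v_minus_th):
        ctx.save_for_backward(v_minus_th)
        return (v_minus_th > 0).float()

    @staticmethod
    def backward(ctx, grad_out):
        (v_minus_th,) = ctx.saved_tensors
        # Rectangular surrogate gradient around the threshold (an assumption;
        # sigmoid-shaped surrogates are an equally common choice).
        return grad_out * (v_minus_th.abs() < 0.5).float()

class RecurrentLIFCell(nn.Module):
    """One time step of a recurrent LIF neuron with a decaying presynaptic current."""
    def __init__(self, in_features, hidden, tau_syn=5.0, tau_mem=10.0, v_th=1.0):
        super().__init__()
        self.fc_in = nn.Linear(in_features, hidden)          # feedforward synapses
        self.fc_rec = nn.Linear(hidden, hidden, bias=False)  # recurrent (cyclic) synapses
        self.alpha = math.exp(-1.0 / tau_syn)  # presynaptic-current decay per step
        self.beta = math.exp(-1.0 / tau_mem)   # membrane-potential decay per step
        self.v_th = v_th

    def forward(self, x_t, state):
        i_t, v_t, s_t = state  # presynaptic current, membrane potential, previous spikes
        # Dynamic presynaptic current: leaks toward zero and is driven by the
        # input plus the neuron's own spikes from the previous time step.
        i_t = self.alpha * i_t + self.fc_in(x_t) + self.fc_rec(s_t)
        v_t = self.beta * v_t + i_t                  # leaky integration
        s_t = SurrogateSpike.apply(v_t - self.v_th)  # fire if threshold is crossed
        v_t = v_t * (1.0 - s_t)                      # hard reset after a spike (an assumption)
        return s_t, (i_t, v_t, s_t)

Unrolled over T time steps from an all-zero state, the accumulated spikes can be read out as a rate code and trained with an ordinary cross-entropy loss; autograd then performs backpropagation through time, with SurrogateSpike supplying the gradient through the non-differentiable firing step.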
Pages: 2242-2265
Number of pages: 24