Efficient training of supervised spiking neural networks via the normalized perceptron based learning rule

Cited by: 22
Authors
Xie, Xiurui [1 ]
Qu, Hong [1 ]
Liu, Guisong [1 ]
Zhang, Malu [1 ]
Affiliation
[1] Univ Elect Sci & Technol China, Sch Comp Sci & Engn, Chengdu 610054, Peoples R China
Keywords
Spiking neural networks; Temporal encoding mechanism; Supervised learning; Perceptron based learning rule; FUNCTIONAL ARCHITECTURE; RECEPTIVE-FIELDS; CLASSIFICATION; OSCILLATIONS; ALGORITHM; RESUME;
DOI
10.1016/j.neucom.2017.01.086
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Spiking neural networks (SNNs) are the third generation of artificial neural networks and have achieved notable success in pattern recognition. However, most existing supervised training methods for SNNs are not efficient enough to meet real-time requirements. To address this issue, this paper proposes the normalized perceptron based learning rule (NPBLR) for the supervised training of multi-layer SNNs. Unlike traditional methods, our algorithm trains only the selected misclassified time points and the target ones, employing a perceptron based neuron. Furthermore, the weight modification in our algorithm is normalized by a voltage based function, which is more efficient than the traditional time based method because the firing time is derived from the voltage value. In contrast to traditional multi-layer algorithms that ignore the temporal accumulation of spikes, our algorithm defines the spiking activity of the postsynaptic neuron as the rate accumulation function of all presynaptic neurons within a specific time frame. Through these strategies, our algorithm overcomes several difficulties in the training of SNNs, such as the inefficiency and no-fire problems. Comprehensive simulations on both single- and multi-layer networks investigate the learning performance of our algorithm; the results demonstrate that it achieves higher learning efficiency and stronger parameter robustness than traditional algorithms. (C) 2017 Elsevier B.V. All rights reserved.
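The abstract's core idea, updating weights only at misclassified and target time points, with a perceptron-style correction normalized by the presynaptic voltage contribution, can be sketched as follows. This is a minimal illustration assuming an exponential PSP kernel, a fixed firing threshold `theta`, and a squared-norm normalizer; the paper's exact kernel, normalization function, and multi-layer extension are not given in this record, so all names and equations here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def psp_kernel(t, tau=4.0):
    # Exponential postsynaptic potential kernel (assumed form):
    # a presynaptic spike at time s contributes exp(-(t - s)/tau) for t > s.
    return np.where(t > 0, np.exp(-t / tau), 0.0)

def membrane_potential(weights, spike_trains, t):
    # Voltage at time t = weighted sum of each presynaptic neuron's PSPs.
    contribs = np.array([psp_kernel(t - s).sum() for s in spike_trains])
    return weights @ contribs, contribs

def npblr_step(weights, spike_trains, t, should_fire, theta=1.0, eta=0.1):
    # Perceptron-style update at one selected time point: only a
    # misclassified point (fires when it should not, or fails to fire
    # when it should) triggers a weight change, and the change is
    # normalized by the presynaptic contribution norm (assumption).
    v, contribs = membrane_potential(weights, spike_trains, t)
    fired = v >= theta
    if fired == should_fire:
        return weights                      # correctly classified: no update
    sign = 1.0 if should_fire else -1.0     # push voltage toward / away from theta
    norm = np.dot(contribs, contribs) + 1e-12
    return weights + eta * sign * contribs / norm
```

With this normalization, each corrective step moves the voltage at the trained time point by roughly `eta` regardless of the PSP magnitudes, which is one plausible reading of how a voltage-based normalizer speeds up convergence compared with a raw time-based update.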
Pages: 152-163
Number of pages: 12