Temporal Coding in Spiking Neural Networks With Alpha Synaptic Function: Learning With Backpropagation

Cited: 39
Authors
Comsa, Iulia-Maria [1 ]
Potempa, Krzysztof [1 ,2 ]
Versari, Luca [1 ]
Fischbacher, Thomas [1 ]
Gesmundo, Andrea [1 ]
Alakuijala, Jyrki [1 ]
Affiliations
[1] Google Res Zurich, CH-8002 Zurich, Switzerland
[2] GSA Capital, London, England
Keywords
Neurons; Encoding; Biological information theory; Timing; Biological system modeling; Biological neural networks; Backpropagation; biological neural networks; feedforward neural networks; image classification; supervised learning; ERROR-BACKPROPAGATION; GRADIENT DESCENT; RATE-CODE; OSCILLATIONS; INFORMATION; INTEGRATION; INHIBITION; ALGORITHM
DOI
10.1109/TNNLS.2021.3071976
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
The timing of individual neuronal spikes is essential for biological brains to make fast responses to sensory stimuli. However, conventional artificial neural networks lack the intrinsic temporal coding ability present in biological networks. We propose a spiking neural network model that encodes information in the relative timing of individual spikes. In classification tasks, the output of the network is indicated by the first neuron to spike in the output layer. This temporal coding scheme allows the supervised training of the network with backpropagation, using locally exact derivatives of the postsynaptic spike times with respect to presynaptic spike times. The network operates using a biologically plausible synaptic transfer function. In addition, we use trainable pulses that provide bias, add flexibility during training, and exploit the decayed part of the synaptic function. We show that such networks can be successfully trained on multiple data sets encoded in time, including MNIST. Our model outperforms comparable spiking models on MNIST and achieves accuracy comparable to fully connected conventional networks with the same architecture. The spiking network spontaneously discovers two operating modes, mirroring the accuracy-speed tradeoff observed in human decision-making: a highly accurate but slow regime and a faster but slightly less accurate one. These results demonstrate the computational power of spiking networks with biological characteristics that encode information in the spike timing of individual neurons. By studying temporal coding in spiking networks, we aim to create building blocks toward energy-efficient, state-based biologically inspired neural architectures. We provide open-source code for the model.
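As a rough illustration of the temporal coding scheme described in the abstract, the sketch below computes the membrane potential of a neuron driven by an alpha-shaped synaptic kernel and reads out the predicted class as the first output neuron to cross threshold. The kernel form eps(dt) = dt * exp(-dt / tau), the threshold, the weights, and the numerical grid search are assumptions for illustration only; the paper derives exact spike-time gradients analytically, which this sketch does not reproduce.

```python
# Minimal numerical sketch (not the authors' implementation) of first-spike
# timing under an assumed alpha synaptic kernel eps(dt) = dt * exp(-dt / tau).
import numpy as np

def membrane_potential(t, spike_times, weights, tau=1.0):
    """Potential at time t: weighted sum of alpha-kernel responses to all
    presynaptic spikes that arrived before t."""
    dt = t - spike_times
    active = dt > 0.0
    return float(np.sum(weights[active] * dt[active] * np.exp(-dt[active] / tau)))

def first_spike_time(spike_times, weights, threshold=1.0, tau=1.0,
                     t_max=10.0, steps=10_000):
    """Scan a time grid and return the earliest threshold crossing,
    or None if the neuron never fires for these inputs."""
    for t in np.linspace(0.0, t_max, steps):
        if membrane_potential(t, spike_times, weights, tau) >= threshold:
            return t
    return None

# Toy readout: with one such neuron per class, the predicted class is the
# neuron that spikes first, as described in the abstract. All values below
# are hypothetical.
inputs = np.array([0.1, 0.3])                # presynaptic spike times
per_class_weights = [np.array([2.0, 1.5]),
                     np.array([0.8, 0.7])]
out_times = [first_spike_time(inputs, w) for w in per_class_weights]
pred = min(range(len(out_times)),
           key=lambda i: out_times[i] if out_times[i] is not None else float("inf"))
print("output spike times:", out_times)
print("predicted class:", pred)
```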
Pages: 5939-5952
Page count: 14
Related Papers
50 records in total
  • [21] An Efficient Learning Algorithm for Direct Training Deep Spiking Neural Networks
    Zhu, Xiaolei
    Zhao, Baixin
    Ma, De
    Tang, Huajin
    IEEE TRANSACTIONS ON COGNITIVE AND DEVELOPMENTAL SYSTEMS, 2022, 14 (03) : 847 - 856
  • [22] One-Pass Online Learning Based on Gradient Descent for Multilayer Spiking Neural Networks
    Lin, Xianghong
    Hu, Tiandou
    Wang, Xiangwen
    IEEE TRANSACTIONS ON COGNITIVE AND DEVELOPMENTAL SYSTEMS, 2023, 15 (01) : 16 - 31
  • [23] Self-backpropagation of synaptic modifications elevates the efficiency of spiking and artificial neural networks
    Zhang, Tielin
    Cheng, Xiang
    Jia, Shuncheng
    Poo, Mu-Ming
    Zeng, Yi
    Xu, Bo
    SCIENCE ADVANCES, 2021, 7 (43)
  • [24] BPSpike II: A New Backpropagation Learning Algorithm for Spiking Neural Networks
    Matsuda, Satoshi
    NEURAL INFORMATION PROCESSING, ICONIP 2016, PT II, 2016, 9948 : 56 - 65
  • [26] Supervised Learning in Spiking Neural Networks for Precise Temporal Encoding
    Gardner, Brian
    Gruning, Andre
    PLOS ONE, 2016, 11 (08):
  • [27] BS4NN: Binarized Spiking Neural Networks with Temporal Coding and Learning
    Kheradpisheh, Saeed Reza
    Mirsadeghi, Maryam
    Masquelier, Timothee
    NEURAL PROCESSING LETTERS, 2022, 54 (02) : 1255 - 1273
  • [28] Training multi-layer spiking neural networks using NormAD based spatio-temporal error backpropagation
    Anwani, Navin
    Rajendran, Bipin
    NEUROCOMPUTING, 2020, 380 : 67 - 77
  • [29] Supervised learning in multilayer spiking neural networks with inner products of spike trains
    Lin, Xianghong
    Wang, Xiangwen
    Hao, Zhanjun
    NEUROCOMPUTING, 2017, 237 : 59 - 70
  • [30] LC-TTFS: Toward Lossless Network Conversion for Spiking Neural Networks With TTFS Coding
    Yang, Qu
    Zhang, Malu
    Wu, Jibin
    Tan, Kay Chen
    Li, Haizhou
    IEEE TRANSACTIONS ON COGNITIVE AND DEVELOPMENTAL SYSTEMS, 2024, 16 (05) : 1626 - 1639