Temporal Coding in Spiking Neural Networks With Alpha Synaptic Function: Learning With Backpropagation

Cited by: 39
Authors
Comsa, Iulia-Maria [1 ]
Potempa, Krzysztof [1 ,2 ]
Versari, Luca [1 ]
Fischbacher, Thomas [1 ]
Gesmundo, Andrea [1 ]
Alakuijala, Jyrki [1 ]
Affiliations
[1] Google Res Zurich, CH-8002 Zurich, Switzerland
[2] GSA Capital, London, England
Keywords
Neurons; Encoding; Biological information theory; Timing; Biological system modeling; Biological neural networks; Backpropagation; biological neural networks; feedforward neural networks; image classification; supervised learning; ERROR-BACKPROPAGATION; GRADIENT DESCENT; RATE-CODE; OSCILLATIONS; INFORMATION; INTEGRATION; INHIBITION; ALGORITHM;
DOI
10.1109/TNNLS.2021.3071976
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
The timing of individual neuronal spikes is essential for biological brains to make fast responses to sensory stimuli. However, conventional artificial neural networks lack the intrinsic temporal coding ability present in biological networks. We propose a spiking neural network model that encodes information in the relative timing of individual spikes. In classification tasks, the output of the network is indicated by the first neuron to spike in the output layer. This temporal coding scheme allows the supervised training of the network with backpropagation, using locally exact derivatives of the postsynaptic spike times with respect to presynaptic spike times. The network operates using a biologically plausible synaptic transfer function. In addition, we use trainable pulses that provide bias, add flexibility during training, and exploit the decayed part of the synaptic function. We show that such networks can be successfully trained on multiple data sets encoded in time, including MNIST. Our model outperforms comparable spiking models on MNIST and achieves similar quality to fully connected conventional networks with the same architecture. The spiking network spontaneously discovers two operating modes, mirroring the accuracy-speed tradeoff observed in human decision-making: a highly accurate but slow regime, and a fast but slightly lower accuracy regime. These results demonstrate the computational power of spiking networks with biological characteristics that encode information in the timing of individual neurons. By studying temporal coding in spiking networks, we aim to create building blocks toward energy-efficient, state-based biologically inspired neural architectures. We provide open-source code for the model.
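For illustration, the following is a minimal numerical sketch of the encoding described in the abstract: a neuron integrates weighted alpha-shaped synaptic kernels triggered by presynaptic spike times and emits a spike at the first threshold crossing. The kernel form epsilon(s) = s * exp(-tau * s), the threshold, the decay rate, and the grid-search crossing are illustrative assumptions, not the paper's implementation (the paper solves the threshold crossing in closed form rather than on a time grid).

```python
import numpy as np

def alpha_kernel(s, tau=1.0):
    """Alpha-shaped synaptic kernel: zero before the presynaptic spike,
    then s * exp(-tau * s), which rises and then decays."""
    return np.where(s >= 0.0, s * np.exp(-tau * s), 0.0)

def first_spike_time(input_times, weights, threshold=1.0, tau=1.0,
                     t_max=10.0, dt=1e-3):
    """Return the earliest time the membrane potential crosses `threshold`,
    or None if the neuron never fires within [0, t_max].

    The potential is a weighted sum of alpha kernels, one per presynaptic
    spike. The crossing is found on a time grid for illustration only."""
    t = np.arange(0.0, t_max, dt)
    v = np.zeros_like(t)
    for t_i, w_i in zip(input_times, weights):
        v += w_i * alpha_kernel(t - t_i, tau)
    above = np.nonzero(v >= threshold)[0]
    return t[above[0]] if above.size else None

if __name__ == "__main__":
    # Two presynaptic spikes; an earlier, stronger input pulls the output spike earlier.
    print(first_spike_time(input_times=[0.1, 0.4], weights=[2.0, 1.5]))
```

In a classification setting of the kind the abstract describes, each output neuron would have such a spike time and the predicted class is the output neuron that fires first; training then backpropagates through the dependence of each postsynaptic spike time on the presynaptic spike times and weights.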
Pages: 5939-5952
Page count: 14