Temporal Coding in Spiking Neural Networks With Alpha Synaptic Function: Learning With Backpropagation

Cited by: 39
Authors
Comsa, Iulia-Maria [1]
Potempa, Krzysztof [1,2]
Versari, Luca [1]
Fischbacher, Thomas [1]
Gesmundo, Andrea [1]
Alakuijala, Jyrki [1]
Affiliations
[1] Google Res Zurich, CH-8002 Zurich, Switzerland
[2] GSA Capital, London, England
Keywords
Neurons; Encoding; Biological information theory; Timing; Biological system modeling; Biological neural networks; Backpropagation; feedforward neural networks; image classification; supervised learning; error backpropagation; gradient descent; rate code; oscillations; information; integration; inhibition; algorithm
DOI
10.1109/TNNLS.2021.3071976
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
The timing of individual neuronal spikes is essential for biological brains to make fast responses to sensory stimuli. However, conventional artificial neural networks lack the intrinsic temporal coding ability present in biological networks. We propose a spiking neural network model that encodes information in the relative timing of individual spikes. In classification tasks, the output of the network is indicated by the first neuron to spike in the output layer. This temporal coding scheme allows the supervised training of the network with backpropagation, using locally exact derivatives of the postsynaptic spike times with respect to presynaptic spike times. The network operates using a biologically plausible synaptic transfer function. In addition, we use trainable pulses that provide bias, add flexibility during training, and exploit the decayed part of the synaptic function. We show that such networks can be successfully trained on multiple data sets encoded in time, including MNIST. Our model outperforms comparable spiking models on MNIST and achieves similar quality to fully connected conventional networks with the same architecture. The spiking network spontaneously discovers two operating modes, mirroring the accuracy-speed tradeoff observed in human decision-making: a highly accurate but slow regime, and a fast but slightly lower accuracy regime. These results demonstrate the computational power of spiking networks with biological characteristics that encode information in the timing of individual neurons. By studying temporal coding in spiking networks, we aim to create building blocks toward energy-efficient, state-based biologically inspired neural architectures. We provide open-source code for the model.
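The abstract describes neurons driven by an alpha-shaped synaptic transfer function, with classification read out from the first output neuron to cross threshold. A minimal illustrative sketch of that idea follows; the kernel shape, the decay constant `TAU`, the threshold, and the numerical grid search for the crossing time are all simplifying assumptions for illustration, not the paper's exact formulation or its closed-form spike-time derivatives.

```python
import numpy as np

TAU = 1.0        # assumed synaptic decay constant (illustrative)
THRESHOLD = 1.0  # assumed firing threshold (illustrative)

def alpha_kernel(t):
    """Alpha-shaped kernel: rises, peaks at t = TAU, then decays; zero before the spike."""
    return np.where(t > 0, (t / TAU) * np.exp(1.0 - t / TAU), 0.0)

def membrane_potential(t, spike_times, weights):
    """Membrane potential as a weighted sum of alpha kernels from presynaptic spikes."""
    return sum(w * alpha_kernel(t - ts) for ts, w in zip(spike_times, weights))

def first_spike_time(spike_times, weights, t_max=10.0, dt=1e-3):
    """Numerically find the first time the potential crosses threshold, or None."""
    grid = np.arange(0.0, t_max, dt)
    v = membrane_potential(grid, spike_times, weights)
    above = np.nonzero(v >= THRESHOLD)[0]
    return float(grid[above[0]]) if above.size else None
```

In a first-spike classification readout, each output neuron's crossing time would be computed this way and the predicted class is the neuron with the earliest spike; stronger input weights pull the crossing earlier, which is the coupling that backpropagation through spike times exploits.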
Pages: 5939-5952 (14 pages)