Temporal Coding in Spiking Neural Networks With Alpha Synaptic Function: Learning With Backpropagation

Cited by: 39
Authors
Comsa, Iulia-Maria [1 ]
Potempa, Krzysztof [1 ,2 ]
Versari, Luca [1 ]
Fischbacher, Thomas [1 ]
Gesmundo, Andrea [1 ]
Alakuijala, Jyrki [1 ]
Affiliations
[1] Google Res Zurich, CH-8002 Zurich, Switzerland
[2] GSA Capital, London, England
Keywords
Neurons; Encoding; Biological information theory; Timing; Biological system modeling; Biological neural networks; Backpropagation; biological neural networks; feedforward neural networks; image classification; supervised learning; ERROR-BACKPROPAGATION; GRADIENT DESCENT; RATE-CODE; OSCILLATIONS; INFORMATION; INTEGRATION; INHIBITION; ALGORITHM;
DOI
10.1109/TNNLS.2021.3071976
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The timing of individual neuronal spikes is essential for biological brains to make fast responses to sensory stimuli. However, conventional artificial neural networks lack the intrinsic temporal coding ability present in biological networks. We propose a spiking neural network model that encodes information in the relative timing of individual spikes. In classification tasks, the output of the network is indicated by the first neuron to spike in the output layer. This temporal coding scheme allows the supervised training of the network with backpropagation, using locally exact derivatives of the postsynaptic spike times with respect to presynaptic spike times. The network operates using a biologically plausible synaptic transfer function. In addition, we use trainable pulses that provide bias, add flexibility during training, and exploit the decayed part of the synaptic function. We show that such networks can be successfully trained on multiple data sets encoded in time, including MNIST. Our model outperforms comparable spiking models on MNIST and achieves similar quality to fully connected conventional networks with the same architecture. The spiking network spontaneously discovers two operating modes, mirroring the accuracy-speed tradeoff observed in human decision-making: a highly accurate but slow regime, and a fast but slightly lower accuracy regime. These results demonstrate the computational power of spiking networks with biological characteristics that encode information in the timing of individual neurons. By studying temporal coding in spiking networks, we aim to create building blocks toward energy-efficient, state-based biologically inspired neural architectures. We provide open-source code for the model.
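The abstract describes neurons that sum alpha-shaped synaptic kernels and fire when the resulting potential first crosses a threshold, with the network's output read off as the first output neuron to spike. A minimal numerical sketch of that forward pass is below; it assumes the common alpha-function form t·exp(-t/τ) and finds the threshold crossing on a time grid. The function names, default parameters, and grid-search approach are illustrative only: the paper's actual model computes spike times and their derivatives exactly rather than by discretized search.

```python
import numpy as np

def alpha_kernel(t, tau=1.0):
    """Alpha synaptic function: zero before input onset, then a
    t*exp(-t/tau)-shaped rise and decay (peak at t = tau)."""
    t = np.asarray(t, dtype=float)
    return np.where(t > 0, t * np.exp(-t / tau), 0.0)

def first_spike_time(input_times, weights, threshold=1.0, tau=1.0,
                     t_max=10.0, dt=1e-3):
    """Sum weighted alpha kernels from the presynaptic spike times and
    return the earliest grid time at which the membrane potential
    reaches the threshold, or None if the neuron never fires."""
    grid = np.arange(0.0, t_max, dt)
    potential = np.zeros_like(grid)
    for t_i, w_i in zip(input_times, weights):
        potential += w_i * alpha_kernel(grid - t_i, tau)
    crossing = np.nonzero(potential >= threshold)[0]
    return grid[crossing[0]] if crossing.size else None

def decode_class(output_spike_times):
    """Temporal readout from the abstract: the predicted class is the
    index of the output neuron that spikes first (None = no spike)."""
    valid = [(t, i) for i, t in enumerate(output_spike_times) if t is not None]
    return min(valid)[1] if valid else None
```

Because the kernel peaks at w·τ/e, a neuron with too little total input weight never reaches the threshold and stays silent, which is why the paper adds trainable bias pulses to keep neurons in a fireable regime.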
Pages: 5939-5952
Page count: 14
Related Papers
50 items in total
  • [31] A Spatial-Channel-Temporal-Fused Attention for Spiking Neural Networks
    Cai, Wuque; Sun, Hongze; Liu, Rui; Cui, Yan; Wang, Jun; Xia, Yang; Yao, Dezhong; Guo, Daqing
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (10): 14315-14329
  • [32] Supervised learning in spiking neural networks with synaptic delay-weight plasticity
    Zhang, Malu; Wu, Jibin; Belatreche, Ammar; Pan, Zihan; Xie, Xiurui; Chua, Yansong; Li, Guoqi; Qu, Hong; Li, Haizhou
    NEUROCOMPUTING, 2020, 409: 103-118
  • [33] Spatio-Temporal Backpropagation for Training High-Performance Spiking Neural Networks
    Wu, Yujie; Deng, Lei; Li, Guoqi; Zhu, Jun; Shi, Luping
    FRONTIERS IN NEUROSCIENCE, 2018, 12
  • [34] On-Chip Training Spiking Neural Networks Using Approximated Backpropagation With Analog Synaptic Devices
    Kwon, Dongseok; Lim, Suhwan; Bae, Jong-Ho; Lee, Sung-Tae; Kim, Hyeongsu; Seo, Young-Tak; Oh, Seongbin; Kim, Jangsaeng; Yeom, Kyuho; Park, Byung-Gook; Lee, Jong-Ho
    FRONTIERS IN NEUROSCIENCE, 2020, 14
  • [35] Supervised learning in spiking neural networks: A review of algorithms and evaluations
    Wang, Xiangwen; Lin, Xianghong; Dang, Xiaochao
    NEURAL NETWORKS, 2020, 125: 258-280
  • [36] Learning rules in spiking neural networks: A survey
    Yi, Zexiang; Lian, Jing; Liu, Qidong; Zhu, Hegui; Liang, Dong; Liu, Jizhao
    NEUROCOMPUTING, 2023, 531: 163-179
  • [37] ATC: Approximate Temporal Coding for Efficient Implementations of Spiking Neural Networks
    Han, Ming; Wang, Ye; Dong, Jian; Liu, Heng; Wu, Jin; Qu, Gang
    PROCEEDINGS OF THE GREAT LAKES SYMPOSIUM ON VLSI 2023, GLSVLSI 2023, 2023: 527-532
  • [38] Training Deep Spiking Neural Networks Using Backpropagation
    Lee, Jun Haeng; Delbruck, Tobi; Pfeiffer, Michael
    FRONTIERS IN NEUROSCIENCE, 2016, 10
  • [39] Optimal Heterogeneity for Coding in Spiking Neural Networks
    Mejias, J. F.; Longtin, A.
    PHYSICAL REVIEW LETTERS, 2012, 108 (21)
  • [40] Fast Learning in Spiking Neural Networks by Learning Rate Adaptation
    Fang, Huijuan; Luo, Jiliang; Wang, Fei
    CHINESE JOURNAL OF CHEMICAL ENGINEERING, 2012, 20 (06): 1219-1224