An Efficient non-Backpropagation Method for Training Spiking Neural Networks

Cited by: 3
Authors
Guo, Shiqi [1 ]
Lin, Tong [2 ,3 ]
Affiliations
[1] Peking Univ, Ctr Data Sci, Beijing, Peoples R China
[2] Peking Univ, Sch EECS, Key Lab Machine Percept MOE, Beijing, Peoples R China
[3] Peng Cheng Lab, Shenzhen, Peoples R China
Keywords
spiking neural networks; non-backpropagation; membrane potential; information representation; DESIGN;
DOI
10.1109/ICTAI52525.2021.00034
CLC number
TP18 [Artificial intelligence theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Spiking Neural Networks (SNNs) have recently attracted significant research interest and are regarded as the next generation of artificial neural networks, owing to their suitability for energy-efficient, event-driven neuromorphic computing. However, existing error-backpropagation (BP) methods for SNNs face severe difficulties: the spike generation function is non-differentiable, and gradients can vanish or explode. In this paper, we introduce an efficient method for training SNNs without backpropagation. The information bottleneck (IB) principle is leveraged to learn the synaptic weights and neuron thresholds of an SNN. The membrane potential state used for information representation is learned online, yielding higher time and space efficiency than the conventional BP method. Experimental results show that the proposed biologically plausible method achieves comparable accuracy with a considerable reduction in training steps and memory when training SNNs on the MNIST and FashionMNIST datasets.
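The non-differentiability the abstract refers to arises in the spike generation step of a spiking neuron. A minimal sketch of a leaky integrate-and-fire (LIF) update illustrates this; the function name, leak constant, and reset rule below are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def lif_step(v, input_current, threshold, leak=0.9):
    """One leaky integrate-and-fire update for a layer of neurons.

    v             -- membrane potentials from the previous time step
    input_current -- weighted input at this time step
    threshold     -- firing threshold (a learnable quantity in the paper)
    leak          -- membrane decay factor (illustrative value)
    """
    # Leaky integration of the input into the membrane potential.
    v = leak * v + input_current
    # Spike generation is a Heaviside step: non-differentiable in v,
    # which is exactly what blocks naive backpropagation through SNNs.
    spikes = (v >= threshold).astype(float)
    # Hard reset of the membrane potential after a spike (one common choice).
    v = v * (1.0 - spikes)
    return v, spikes

# One step for three neurons: only the third input crosses the threshold.
v, s = lif_step(np.zeros(3), np.array([0.3, 0.6, 1.2]), threshold=1.0)
```

The proposed method sidesteps differentiating through this step by learning from the membrane potential state directly under the IB objective, rather than backpropagating through the spike function.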
Pages: 192-199
Page count: 8
Related papers
50 records in total
  • [1] Training Deep Spiking Neural Networks Using Backpropagation
    Lee, Jun Haeng
    Delbruck, Tobi
    Pfeiffer, Michael
    FRONTIERS IN NEUROSCIENCE, 2016, 10
  • [2] Memristor Circuits for Non-Backpropagation Training Algorithm
    Oh, Seokjin
    Yoon, Rina
    Cho, Seungmyeong
    Min, Kyeong-Sik
    2023 20TH INTERNATIONAL SOC DESIGN CONFERENCE, ISOCC, 2023, : 201 - 201
  • [3] Efficient training of backpropagation neural networks
    Otair, Mohammed A.
    Salameh, Walid A.
    NEURAL NETWORK WORLD, 2006, 16 (04) : 291 - 311
  • [4] Towards Memory- and Time-Efficient Backpropagation for Training Spiking Neural Networks
    Meng, Qingyan
    Xiao, Mingqing
    Yan, Shen
    Wang, Yisen
    Lin, Zhouchen
    Luo, Zhi-Quan
    2023 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION, ICCV, 2023, : 6143 - 6153
  • [5] Training Spiking Neural Networks with Event-driven Backpropagation
    Zhu, Yaoyu
    Yu, Zhaofei
    Fang, Wei
    Xie, Xiaodong
    Huang, Tiejun
    Masquelier, Timothee
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [6] Efficient training of spiking neural networks with temporally-truncated local backpropagation through time
    Guo, Wenzhe
    Fouda, Mohammed E.
    Eltawil, Ahmed M.
    Salama, Khaled Nabil
    FRONTIERS IN NEUROSCIENCE, 2023, 17
  • [7] Spiking Neural Networks Using Backpropagation
    Syed, Tehreem
    Kakani, Vijay
    Cui, Xuenan
    Kim, Hakil
    2021 IEEE REGION 10 SYMPOSIUM (TENSYMP), 2021,
  • [8] Dynamic layer-span connecting spiking neural networks with backpropagation training
    Wang, Zijjian
    Huang, Yuxuan
    Zhu, Yaqin
    Xu, Binxing
    Chen, Long
    COMPLEX & INTELLIGENT SYSTEMS, 2024, 10 : 1937 - 1952
  • [9] Hybrid Macro/Micro Level Backpropagation for Training Deep Spiking Neural Networks
    Jin, Yingyezhe
    Zhang, Wenrui
    Li, Peng
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [10] Backpropagation with biologically plausible spatiotemporal adjustment for training deep spiking neural networks
    Shen, Guobin
    Zhao, Dongcheng
    Zeng, Yi
    PATTERNS, 2022, 3 (06):