An Efficient Non-Backpropagation Method for Training Spiking Neural Networks

Cited by: 2
Authors
Guo, Shiqi [1 ]
Lin, Tong [2 ,3 ]
Affiliations
[1] Peking Univ, Ctr Data Sci, Beijing, Peoples R China
[2] Peking Univ, Sch EECS, Key Lab Machine Percept MOE, Beijing, Peoples R China
[3] Peng Cheng Lab, Shenzhen, Peoples R China
Keywords
spiking neural networks; non-backpropagation; membrane potential; information representation; DESIGN;
DOI
10.1109/ICTAI52525.2021.00034
CLC number
TP18 [Artificial intelligence theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
Spiking Neural Networks (SNNs) have recently attracted significant research interest and have been regarded as the next generation of artificial neural networks, owing to their suitability for energy-efficient, event-driven neuromorphic computing. However, existing SNN error backpropagation (BP) methods face severe difficulties with non-differentiable spike generation functions and with vanishing or exploding gradients. In this paper, we introduce an efficient method for training SNNs without backpropagation. The information bottleneck (IB) principle is leveraged to learn both the synaptic weights and the neuron thresholds of an SNN. The membrane potential state used for information representation is learned online, yielding higher time and space efficiency than the conventional BP method. Experimental results show that the proposed biologically plausible method achieves comparable accuracy with a considerable reduction in training steps and memory when training SNNs on the MNIST and Fashion-MNIST datasets.
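To make the abstract's idea concrete, the following is a minimal illustrative sketch, not the authors' algorithm: a leaky integrate-and-fire (LIF) layer trained without backpropagation, using a zeroth-order (SPSA) update on a layer-local, IB-flavored objective (label relevance minus input redundancy, measured with a linear-kernel HSIC surrogate). All function names, the surrogate loss, and the SPSA choice are assumptions for illustration; the updates touch both weights and the firing threshold, as the paper's method learns both.

```python
import numpy as np

rng = np.random.default_rng(0)

def lif_forward(x, w, threshold, steps=20, decay=0.9):
    """Simulate one layer of leaky integrate-and-fire neurons.
    x: (batch, n_in) input rates in [0, 1]; returns mean firing rates (batch, n_out)."""
    batch, n_out = x.shape[0], w.shape[1]
    v = np.zeros((batch, n_out))      # membrane potential
    rates = np.zeros((batch, n_out))
    for _ in range(steps):
        spikes_in = (rng.random(x.shape) < x).astype(float)  # Poisson-like rate coding
        v = decay * v + spikes_in @ w                        # leaky integration
        out = (v >= threshold).astype(float)                 # spike on threshold crossing
        v = np.where(out > 0, 0.0, v)                        # reset fired neurons
        rates += out
    return rates / steps

def local_ib_like_loss(h, y_onehot, x, beta=0.1):
    """Layer-local surrogate objective in the spirit of the information bottleneck:
    reward dependence on labels, penalize redundancy with the input."""
    def hsic_lin(a, b):               # linear-kernel HSIC, up to scaling
        a = a - a.mean(0)
        b = b - b.mean(0)
        return np.sum((a.T @ b) ** 2) / (a.shape[0] ** 2)
    return -hsic_lin(h, y_onehot) + beta * hsic_lin(h, x)

def spsa_step(x, y_onehot, w, threshold, lr=0.5, eps=1e-2):
    """One zeroth-order (SPSA) update: the objective is probed with random
    perturbations, so no gradient flows through the spike function at all."""
    dw = rng.choice([-1.0, 1.0], size=w.shape)
    dt = rng.choice([-1.0, 1.0])
    lp = local_ib_like_loss(lif_forward(x, w + eps * dw, threshold + eps * dt), y_onehot, x)
    lm = local_ib_like_loss(lif_forward(x, w - eps * dw, threshold - eps * dt), y_onehot, x)
    g = (lp - lm) / (2 * eps)         # scalar directional-derivative estimate
    return w - lr * g * dw, threshold - lr * g * dt
```

Because the update needs only two forward simulations per step and no stored activation trace, it avoids the non-differentiability and gradient-explosion issues the abstract attributes to BP; the real method in the paper derives its updates from the IB principle rather than from random perturbations.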
Pages: 192-199
Page count: 8
Related papers
50 records in total
  • [21] Backpropagation for population-temporal coded spiking neural networks
    Schrauwen, Benjamin
    Van Campenhout, Jan
    2006 IEEE INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORK PROCEEDINGS, VOLS 1-10, 2006, : 1797 - +
  • [22] A Modified Gradient-based Backpropagation Training Method for Neural Networks
    Mu, Xuewen
    Zhang, Yaling
    2009 IEEE INTERNATIONAL CONFERENCE ON GRANULAR COMPUTING (GRC 2009), 2009, : 450 - +
  • [23] An Efficient Learning Algorithm for Direct Training Deep Spiking Neural Networks
    Zhu, Xiaolei
    Zhao, Baixin
    Ma, De
    Tang, Huajin
    IEEE TRANSACTIONS ON COGNITIVE AND DEVELOPMENTAL SYSTEMS, 2022, 14 (03) : 847 - 856
  • [24] Training Spiking Neural Networks with Accumulated Spiking Flow
    Wu, Hao
    Zhang, Yueyi
    Weng, Wenming
    Zhang, Yongting
    Xiong, Zhiwei
    Zha, Zheng-Jun
    Sun, Xiaoyan
    Wu, Feng
    THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 10320 - 10328
  • [25] AN ADAPTIVE TRAINING ALGORITHM FOR BACKPROPAGATION NEURAL NETWORKS
    HSIN, HC
    LI, CC
    SUN, MG
    SCLABASSI, RJ
    IEEE TRANSACTIONS ON SYSTEMS MAN AND CYBERNETICS, 1995, 25 (03) : 512 - 514
  • [26] A stochastic backpropagation algorithm for training neural networks
    Chen, YQ
    Yin, T
    Babri, HA
    ICICS - PROCEEDINGS OF 1997 INTERNATIONAL CONFERENCE ON INFORMATION, COMMUNICATIONS AND SIGNAL PROCESSING, VOLS 1-3: THEME: TRENDS IN INFORMATION SYSTEMS ENGINEERING AND WIRELESS MULTIMEDIA COMMUNICATIONS, 1997, : 703 - 707
  • [27] IMPROVEMENT OF THE BACKPROPAGATION ALGORITHM FOR TRAINING NEURAL NETWORKS
    LEONARD, J
    KRAMER, MA
    COMPUTERS & CHEMICAL ENGINEERING, 1990, 14 (03) : 337 - 341
  • [28] Training Delays in Spiking Neural Networks
    State, Laura
    Aceituno, Pau Vilimelis
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2019: THEORETICAL NEURAL COMPUTATION, PT I, 2019, 11727 : 713 - 717
  • [29] Chip-In-Loop SNN Proxy Learning: a new method for efficient training of spiking neural networks
    Liu, Yuhang
    Liu, Tingyu
    Hu, Yalun
    Liao, Wei
    Xing, Yannan
    Sheik, Sadique
    Qiao, Ning
    FRONTIERS IN NEUROSCIENCE, 2024, 17
  • [30] Temporal Backpropagation for Spiking Neural Networks with One Spike per Neuron
    Kheradpisheh, Saeed Reza
    Masquelier, Timothee
    INTERNATIONAL JOURNAL OF NEURAL SYSTEMS, 2020, 30 (06)