Spiking Neural Networks Trained via Proxy

Cited by: 8
Authors
Kheradpisheh, Saeed Reza [1]
Mirsadeghi, Maryam [2]
Masquelier, Timothee [3]
Affiliations
[1] Shahid Beheshti Univ, Fac Math Sci, Dept Comp & Data Sci, Tehran 1983969411, Iran
[2] Amirkabir Univ Technol, Dept Elect Engn, Tehran 158754413, Iran
[3] Univ Toulouse 3, Ctr Natl Rech Sci, CerCo UMR 5549, F-31062 Toulouse, France
Keywords
Neurons; Artificial neural networks; Encoding; Backpropagation; Biological neural networks; Learning (artificial intelligence); Firing; Spiking neural networks; supervised learning; proxy learning; rate coding; error-backpropagation
DOI
10.1109/ACCESS.2022.3187033
Chinese Library Classification
TP [Automation and computer technology]
Discipline Classification Code
0812
Abstract
We propose a new learning algorithm for training spiking neural networks (SNNs) using conventional artificial neural networks (ANNs) as a proxy. We couple an SNN made of integrate-and-fire (IF) neurons with an ANN made of ReLU neurons; the two networks share the same architecture and the same synaptic weights. The forward passes of the two networks are fully independent. Treating a rate-coded IF neuron as an approximation of ReLU, we backpropagate the SNN's error through the proxy ANN to update the shared weights, simply by replacing the ANN's final output with that of the SNN. We applied the proposed proxy learning to deep convolutional SNNs and evaluated it on the Fashion-MNIST and CIFAR-10 benchmark datasets, reaching 94.56% and 93.11% classification accuracy, respectively. The proposed networks outperform deep SNNs trained with tandem learning or surrogate gradient learning, as well as SNNs converted from deep ANNs. Converted SNNs require long simulation times to reach reasonable accuracy, whereas our proxy learning yields efficient SNNs with much shorter simulation times. The source code of the proposed method is publicly available at https://github.com/SRKH/ProxyLearning.
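The key assumption stated in the abstract — that a rate-coded IF neuron approximates a ReLU unit — can be checked with a minimal sketch. The code below is hypothetical illustration, not taken from the paper's repository: it simulates a single IF neuron with soft reset driven by a constant input current x for T timesteps, and its firing rate tracks max(0, x) for inputs in [0, 1].

```python
def if_rate(x, T=100, threshold=1.0):
    """Firing rate of a single integrate-and-fire neuron receiving a
    constant input current x for T timesteps (soft reset on spike)."""
    v = 0.0       # membrane potential
    spikes = 0
    for _ in range(T):
        v += x                 # integrate the input current
        if v >= threshold:     # fire when the threshold is reached
            spikes += 1
            v -= threshold     # soft reset: keep the residual charge
    return spikes / T

# For sub-threshold-per-step inputs in [0, 1], the rate ≈ ReLU(x):
for x in (-0.5, 0.0, 0.25, 0.7):
    print(f"x = {x:5.2f}  rate = {if_rate(x):.2f}")
```

Negative inputs never drive the potential above threshold, so the rate is exactly zero there, mirroring the ReLU's dead region; this correspondence is what lets the ANN stand in for the SNN during backpropagation.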
Pages: 70769-70778
Page count: 10