High-accuracy deep ANN-to-SNN conversion using quantization-aware training framework and calcium-gated bipolar leaky integrate and fire neuron

Cited by: 11
Authors
Gao, Haoran [1 ]
He, Junxian [1 ]
Wang, Haibing [1 ]
Wang, Tengxiao [1 ]
Zhong, Zhengqing [1 ]
Yu, Jianyi [1 ]
Wang, Ying [2 ]
Tian, Min [1 ]
Shi, Cong [1 ]
Affiliations
[1] Chongqing Univ, Sch Microelect & Commun Engn, Chongqing, Peoples R China
[2] Chinese Acad Sci, Inst Comp Technol, State Key Lab Comp Architecture, Beijing, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
neuromorphic computing; spiking neural network; ANN-to-SNN conversion; deep SNNs; quantization-aware training;
DOI
10.3389/fnins.2023.1141701
CLC Number
Q189 [Neuroscience];
Discipline Code
071006;
Abstract
Spiking neural networks (SNNs) have attracted intensive attention due to their efficient, event-driven computing paradigm. Among SNN training methods, ANN-to-SNN conversion is generally regarded as achieving state-of-the-art recognition accuracy. However, many existing ANN-to-SNN techniques impose lengthy post-conversion steps, such as threshold balancing and weight renormalization, to compensate for the inherent behavioral discrepancy between artificial and spiking neurons. In addition, they require a long temporal window to encode and process enough spikes to approximate the real-valued ANN neurons, leading to high inference latency. To overcome these challenges, we propose a calcium-gated bipolar leaky integrate-and-fire (Ca-LIF) spiking neuron model that better approximates the ReLU neurons widely adopted in ANNs. We also propose a quantization-aware training (QAT)-based framework that leverages an off-the-shelf QAT toolkit for simple ANN-to-SNN conversion, directly exporting the learned ANN weights to SNNs with no post-conversion processing. We benchmarked our method on typical deep network structures with time-step lengths ranging from 8 to 128. Compared with other work, our converted SNNs achieved competitively high accuracy while requiring relatively few inference time steps.
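To make the neuron-level idea concrete, below is a minimal PyTorch sketch of what a bipolar LIF unit with a calcium-like gate could look like. The abstract does not give the Ca-LIF equations, so the class name, state variables, constants, and gating rule here (a net spike-count trace that blocks negative spikes whenever the cumulative output would go below zero) are illustrative assumptions, not the paper's definition.

```python
# Hypothetical sketch of a calcium-gated bipolar LIF (Ca-LIF) neuron.
# The membrane integrates input current and emits +1/-1 spikes against a
# threshold; a calcium-like trace of recent net spikes gates negative spikes
# so the unit's cumulative output stays non-negative, like a ReLU.
# All names and dynamics below are assumptions for illustration only.
import torch
import torch.nn as nn

class CaLIF(nn.Module):
    def __init__(self, threshold: float = 1.0, leak: float = 1.0, ca_decay: float = 1.0):
        super().__init__()
        self.threshold = threshold
        self.leak = leak          # 1.0 = no leak (IF behavior); <1.0 = leaky
        self.ca_decay = ca_decay  # 1.0 keeps an exact net spike count
        self.v = None             # membrane potential
        self.ca = None            # calcium trace (net spikes emitted so far)

    def reset(self):
        self.v, self.ca = None, None

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.v is None:
            self.v = torch.zeros_like(x)
            self.ca = torch.zeros_like(x)
        self.v = self.leak * self.v + x
        pos = (self.v >= self.threshold).float()            # positive spike
        # A negative spike is allowed only if the calcium trace shows prior
        # firing, so the cumulative output never drops below zero.
        neg = ((self.v <= -self.threshold) & (self.ca > 0)).float()
        out = pos - neg
        self.v = self.v - out * self.threshold              # soft reset
        self.ca = self.ca_decay * self.ca + out
        return out
```

The bipolar (+1/-1) spikes let the unit retract earlier over-firing, and the non-negativity enforced by the calcium gate is what lets the time-averaged output mimic a ReLU.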
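On the training side, a common way to realize the QAT-for-conversion idea is to fake-quantize each ReLU activation to as many discrete levels as the SNN has time steps, so that an activation level k maps onto k spikes within the window. The sketch below illustrates this general principle with a hypothetical QuantReLU module using a straight-through estimator; the paper itself relies on an off-the-shelf QAT toolkit rather than this exact code.

```python
# Hypothetical sketch of a fake-quantized ReLU for ANN-to-SNN QAT.
# Activations are clipped to [0, alpha] and rounded to T+1 levels, where T
# is the SNN time window; gradients bypass the rounding (straight-through).
# Module name and parameterization are illustrative assumptions.
import torch
import torch.nn as nn

class QuantReLU(nn.Module):
    def __init__(self, num_steps: int = 8):
        super().__init__()
        self.T = num_steps
        # Trainable clipping ceiling, analogous to the spiking threshold.
        self.alpha = nn.Parameter(torch.tensor(1.0))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        y = torch.clamp(x / self.alpha, 0.0, 1.0)
        # Quantize to levels {0, 1/T, ..., 1}; detach the rounding error so
        # gradients flow as if no rounding happened (straight-through).
        y_q = torch.round(y * self.T) / self.T
        y = y + (y_q - y).detach()
        return self.alpha * y
```

After such training, the learned weights and the clipping ceiling (playing the role of the firing threshold) can be carried over to the SNN directly, which is what removes the usual post-conversion threshold-balancing and weight-renormalization steps the abstract describes.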
Pages: 11