Neurogenesis Dynamics-inspired Spiking Neural Network Training Acceleration

Cited by: 1
Authors
Huang, Shaoyi [1 ]
Fang, Haowen
Mahmood, Kaleel [1 ]
Lei, Bowen [2 ]
Xu, Nuo [3 ]
Lei, Bin [1 ]
Sun, Yue [3 ]
Xu, Dongkuan [4 ]
Wen, Wujie [3 ]
Ding, Caiwen [1 ]
Affiliations
[1] Univ Connecticut, Storrs, CT 06269 USA
[2] Texas A&M Univ, College Stn, TX 77843 USA
[3] Lehigh Univ, Bethlehem, PA USA
[4] North Carolina State Univ, Raleigh, NC USA
Source
2023 60TH ACM/IEEE DESIGN AUTOMATION CONFERENCE, DAC | 2023
Funding
US National Science Foundation
Keywords
spiking neural network; neural network pruning; sparse training; neuromorphic computing;
DOI
10.1109/DAC56929.2023.10247810
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification
081104; 0812; 0835; 1405
Abstract
Biologically inspired Spiking Neural Networks (SNNs) have attracted significant attention for their ability to provide extremely energy-efficient machine intelligence through event-driven operation and sparse activity. As artificial intelligence (AI) becomes ever more democratized, there is an increasing need to execute SNN models on edge devices. Existing works adopt weight pruning to reduce SNN model size and accelerate inference. However, these methods mainly focus on how to obtain a sparse model for efficient inference, rather than on training efficiency. To overcome these drawbacks, in this paper, we propose a Neurogenesis Dynamics-inspired Spiking Neural Network training acceleration framework, NDSNN. Our framework is computationally efficient and trains a model from scratch with dynamic sparsity without sacrificing model fidelity. Specifically, we design a new drop-and-grow strategy with a decreasing number of non-zero weights, to maintain extremely high sparsity and high accuracy. We evaluate NDSNN using VGG-16 and ResNet-19 on CIFAR-10, CIFAR-100 and Tiny-ImageNet. Experimental results show that NDSNN achieves up to 20.52% improvement in accuracy on Tiny-ImageNet using ResNet-19 (with a sparsity of 99%) as compared to other SOTA methods (e.g., Lottery Ticket Hypothesis (LTH), SET-SNN, RigL-SNN). In addition, the training cost of NDSNN is only 40.89% of the LTH training cost on ResNet-19 and 31.35% of the LTH training cost on VGG-16 on CIFAR-10.
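The drop-and-grow strategy described in the abstract can be illustrated with a minimal sketch. The following NumPy snippet assumes magnitude-based dropping, gradient-based regrowth, and a shrinking non-zero budget, which are common choices in dynamic sparse training (e.g., RigL-style updates); the function name `drop_and_grow` and its parameters are hypothetical and do not reproduce the authors' exact NDSNN implementation.

```python
import numpy as np

def drop_and_grow(weights, grads, mask, new_nnz, drop_frac=0.3):
    """One drop-and-grow update on a binary sparsity mask.

    Drops the smallest-magnitude active weights, then regrows
    connections where the gradient magnitude is largest, ending with
    at most `new_nnz` non-zero weights. Passing a `new_nnz` below the
    current count makes overall sparsity increase over training.
    """
    flat_w = np.abs(weights).ravel()
    flat_m = mask.ravel().copy()
    active = np.flatnonzero(flat_m)

    # Drop: deactivate a fraction of the weakest active connections.
    n_drop = int(len(active) * drop_frac)
    weakest = active[np.argsort(flat_w[active])[:n_drop]]
    flat_m[weakest] = 0

    # Grow: activate inactive positions with the largest gradient
    # magnitudes, up to the (possibly shrinking) non-zero budget.
    n_grow = new_nnz - int(flat_m.sum())
    if n_grow > 0:
        inactive = np.flatnonzero(flat_m == 0)
        flat_g = np.abs(grads).ravel()
        strongest = inactive[np.argsort(flat_g[inactive])[-n_grow:]]
        flat_m[strongest] = 1

    return flat_m.reshape(mask.shape)
```

Repeating this update at intervals while lowering `new_nnz` on a schedule mirrors the abstract's "decreasing number of non-zero weights": the network keeps rewiring (as in neurogenesis) while its overall sparsity rises toward the target.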
Pages: 6