Energy-Efficient Models for High-Dimensional Spike Train Classification using Sparse Spiking Neural Networks

Cited: 4
Authors
Yin, Hang [1 ]
Lee, John Boaz [1 ]
Kong, Xiangnan [1 ]
Hartvigsen, Thomas [1 ]
Xie, Sihong [2 ]
Affiliations
[1] Worcester Polytech Inst, Worcester, MA 01609 USA
[2] Lehigh Univ, Bethlehem, PA 18015 USA
Source
KDD '21: PROCEEDINGS OF THE 27TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY & DATA MINING | 2021
Keywords
spiking neural networks; supervised learning; spatio-temporal coding; sparsity; hard-concrete distribution;
DOI
10.1145/3447548.3467252
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Spike train classification is an important problem in many areas, such as healthcare and mobile sensing, where each spike train is a high-dimensional time series of binary values. Conventional research on spike train classification mainly focuses on developing Spiking Neural Networks (SNNs) under resource-sufficient settings (e.g., on GPU servers), where the neurons in each layer of the SNN are usually densely connected. However, many real-world applications require deploying SNN models on resource-constrained platforms (e.g., mobile devices) to analyze high-dimensional spike train data, and the high resource requirements of densely-connected SNNs make them hard to deploy on such devices. In this paper, we study the problem of energy-efficient SNNs with sparsely-connected neurons. We propose an SNN model with sparse spatio-temporal coding. Our solution is based on the re-parameterization of weights in an SNN and the application of sparsity regularization during optimization. We compare our work with state-of-the-art SNNs and demonstrate that our sparse SNNs achieve significantly better computational efficiency on both neuromorphic and standard datasets with comparable classification accuracy. Furthermore, through extensive experiments, we show that our method generalizes better than densely-connected SNNs on small datasets.
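The abstract describes re-parameterizing SNN weights and applying sparsity regularization during optimization, and the keywords point to the hard-concrete distribution. The sketch below illustrates the general hard-concrete gating technique (Louizos et al.'s L0 relaxation) that such keywords refer to: each weight is multiplied by a stochastic gate that can land exactly on 0, and the expected number of non-zero gates serves as a differentiable sparsity penalty. This is a minimal illustration of the generic technique, not the authors' exact formulation; all names, constants, and shapes are illustrative.

```python
import numpy as np

# Hard-concrete gate: a differentiable relaxation of binary gates, one
# common way to learn sparse connections. beta is a temperature; the
# interval is stretched by gamma < 0 < zeta so clipping can produce
# exact zeros (pruned weights) and ones (kept weights).
BETA, GAMMA, ZETA = 2.0 / 3.0, -0.1, 1.1

def sample_hard_concrete(log_alpha, rng):
    """Sample a gate z in [0, 1] per weight; exact zeros prune weights."""
    u = rng.uniform(1e-6, 1.0 - 1e-6, size=log_alpha.shape)
    s = 1.0 / (1.0 + np.exp(-(np.log(u) - np.log(1.0 - u) + log_alpha) / BETA))
    s_bar = s * (ZETA - GAMMA) + GAMMA          # stretch to (gamma, zeta)
    return np.clip(s_bar, 0.0, 1.0)             # hard clip -> exact 0s and 1s

def expected_l0_penalty(log_alpha):
    """P(z != 0) per gate: the expected L0 norm used as a regularizer."""
    return 1.0 / (1.0 + np.exp(-(log_alpha - BETA * np.log(-GAMMA / ZETA))))

rng = np.random.default_rng(0)
log_alpha = np.array([-4.0, 0.0, 4.0])          # learned per-weight gate params
z = sample_hard_concrete(log_alpha, rng)
weights = np.array([0.5, -1.2, 0.8]) * z        # gated (sparsified) weights
penalty = expected_l0_penalty(log_alpha).sum()  # added to the task loss
```

During training, the penalty term pushes `log_alpha` down for unimportant weights so their gates collapse to exactly zero, which is what yields the sparsely-connected, energy-efficient networks the abstract targets.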
Pages: 2017 - 2025
Page count: 9
Related Papers
50 records in total
  • [1] Dynamic Spike Bundling for Energy-Efficient Spiking Neural Networks
    Krithivasan, Sarada
    Sen, Sanchari
    Venkataramani, Swagath
    Raghunathan, Anand
    2019 IEEE/ACM INTERNATIONAL SYMPOSIUM ON LOW POWER ELECTRONICS AND DESIGN (ISLPED), 2019,
  • [2] Towards Energy-Efficient Sentiment Classification with Spiking Neural Networks
    Chen, Junhao
    Ye, Xiaojun
    Sun, Jingbo
    Li, Chao
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING, ICANN 2023, PART X, 2023, 14263 : 518 - 529
  • [3] Distributed Learning of Deep Sparse Neural Networks for High-dimensional Classification
    Garg, Shweta
    Krishnan, R.
    Jagannathan, S.
    Samaranayake, V. A.
    2018 IEEE INTERNATIONAL CONFERENCE ON BIG DATA (BIG DATA), 2018, : 1587 - 1592
  • [4] Neural Dynamics Pruning for Energy-Efficient Spiking Neural Networks
    Huang, Haoyu
    He, Linxuan
    Liu, Faqiang
    Zhao, Rong
    Shi, Luping
    2024 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO, ICME 2024, 2024,
  • [5] BitSNNs: Revisiting Energy-Efficient Spiking Neural Networks
    Hu, Yangfan
    Zheng, Qian
    Pan, Gang
    IEEE TRANSACTIONS ON COGNITIVE AND DEVELOPMENTAL SYSTEMS, 2024, 16 (05) : 1736 - 1747
  • [6] AutoSNN: Towards Energy-Efficient Spiking Neural Networks
    Na, Byunggook
    Mok, Jisoo
    Park, Seongsik
    Lee, Dongjin
    Choe, Hyeokjun
    Yoon, Sungroh
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022,
  • [7] Accurate, Energy-Efficient Classification with Spiking Random Neural Network
    Hussain, Khaled F.
    Bassyouni, Mohamed Yousef
    Gelenbe, Erol
    PROBABILITY IN THE ENGINEERING AND INFORMATIONAL SCIENCES, 2021, 35 (01) : 51 - 61
  • [8] Training Energy-Efficient Deep Spiking Neural Networks with Single-Spike Hybrid Input Encoding
    Datta, Gourav
    Kundu, Souvik
    Beerel, Peter A.
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021,
  • [9] On the classification consistency of high-dimensional sparse neural network
    Yang, Kaixu
    Maiti, Taps
    2019 IEEE INTERNATIONAL CONFERENCE ON DATA SCIENCE AND ADVANCED ANALYTICS (DSAA 2019), 2019, : 173 - 182
  • [10] TQ-TTFS: High-Accuracy and Energy-Efficient Spiking Neural Networks Using Temporal Quantization Time-to-First-Spike Neuron
    Yang, Yuxuan
    Xuan, Zihao
    Kang, Yi
    29TH ASIA AND SOUTH PACIFIC DESIGN AUTOMATION CONFERENCE, ASP-DAC 2024, 2024, : 836 - 841