Sparsity Enables Data and Energy Efficient Spiking Convolutional Neural Networks

Cited by: 5
Authors
Bhatt, Varun [1 ]
Ganguly, Udayan [1 ]
Affiliations
[1] Indian Inst Technol, Dept Elect Engn, Mumbai, Maharashtra, India
Keywords
Sparse coding; Unsupervised learning; Feature extraction; Spiking neural networks; Training data efficiency; NEURONS;
DOI
10.1007/978-3-030-01418-6_26
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
In recent years, deep learning has surpassed human performance on image recognition tasks. A major issue with deep learning systems is their reliance on large datasets for optimal performance. When presented with a new task, the ability to generalize from small amounts of data becomes highly attractive. Research has shown that the human visual cortex might employ sparse coding to extract features from the images we see, leading to efficient use of the available data. To ensure good generalization and energy efficiency, we create a multi-layer spiking convolutional neural network that performs layer-wise sparse coding for unsupervised feature extraction. Applied to the MNIST dataset, it achieves 92.3% accuracy with just 500 training samples, about 4x fewer than vanilla CNNs need for comparable accuracy, and reaches 98.1% accuracy with the full dataset. Only around 7000 spikes are used per image (a 6x reduction in transferred bits per forward pass compared to CNNs), implying high sparsity. Thus, we show that our algorithm ensures better sparsity, leading to improved data and energy efficiency in learning, which is essential for some real-world applications.
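To make the reported spike budget concrete, the sketch below shows one way a single spiking convolutional layer could be simulated and its spike count per image measured, using a Poisson rate code for the input and leaky integrate-and-fire neurons for the layer. This is a minimal illustration under assumed settings; the encoding, neuron model, thresholds, and all function names are hypothetical, and the paper's actual architecture, sparse-coding mechanism, and learning rule are not described in this record.

# Minimal sketch (assumptions: Poisson rate coding, leaky integrate-and-fire
# neurons, fixed threshold). All names and parameters are illustrative and are
# not taken from the paper.
import numpy as np

def poisson_encode(image, timesteps=20, max_rate=0.5, rng=None):
    """Rate-code a [0, 1] image into a binary spike train of shape (T, H, W)."""
    rng = rng if rng is not None else np.random.default_rng(0)
    probs = np.clip(image, 0.0, 1.0) * max_rate
    return (rng.random((timesteps, *image.shape)) < probs).astype(np.float32)

def lif_conv_layer(spikes, kernels, threshold=1.0, leak=0.9):
    """Leaky integrate-and-fire convolution: each timestep the membrane
    potential integrates the convolutional input; a unit spikes and resets
    when it crosses the threshold, so most units stay silent (sparse output)."""
    T, H, W = spikes.shape
    K, kh, kw = kernels.shape
    oh, ow = H - kh + 1, W - kw + 1
    v = np.zeros((K, oh, ow), dtype=np.float32)       # membrane potentials
    out = np.zeros((T, K, oh, ow), dtype=np.float32)  # output spike train
    for t in range(T):
        for k in range(K):
            for i in range(oh):
                for j in range(ow):
                    v[k, i, j] = leak * v[k, i, j] + np.sum(
                        spikes[t, i:i + kh, j:j + kw] * kernels[k])
        fired = v >= threshold
        out[t][fired] = 1.0
        v[fired] = 0.0                                # reset after a spike
    return out

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    image = rng.random((28, 28))                      # stand-in for one MNIST digit
    in_spikes = poisson_encode(image, timesteps=20, rng=rng)
    kernels = (0.1 * rng.standard_normal((8, 5, 5))).astype(np.float32)
    out_spikes = lif_conv_layer(in_spikes, kernels, threshold=2.0)
    print("spikes transferred for this image:", int(in_spikes.sum() + out_spikes.sum()))

Summing input and output spikes per image gives the kind of count the abstract compares against the activations a conventional CNN transfers in one forward pass.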
Pages: 263 - 272
Number of pages: 10
Related papers (50 in total)
  • [1] Spiking Deep Convolutional Neural Networks for Energy-Efficient Object Recognition
    Cao, Yongqiang
    Chen, Yang
    Khosla, Deepak
    International Journal of Computer Vision, 2015, 113 (01) : 54 - 66
  • [2] Selective Pruning of Sparsity-Supported Energy-Efficient Accelerator for Convolutional Neural Networks
    Liu, Chia-Chi
    Zhang, Xuezhi
    Wey, I-Chyn
    Teo, T. Hui
    2023 IEEE 16th International Symposium on Embedded Multicore/Many-Core Systems-on-Chip (MCSoC), 2023, : 454 - 461
  • [3] SparseTrain: Exploiting Dataflow Sparsity for Efficient Convolutional Neural Networks Training
    Dai, Pengcheng
    Yang, Jianlei
    Ye, Xucheng
    Cheng, Xingzhou
    Luo, Junyu
    Song, Linghao
    Chen, Yiran
    Zhao, Weisheng
    Proceedings of the 2020 57th ACM/EDAC/IEEE Design Automation Conference (DAC), 2020
  • [4] Deterministic conversion rule for CNNs to efficient spiking convolutional neural networks
    Yang, Xu
    Zhang, Zhongxing
    Zhu, Wenping
    Yu, Shuangming
    Liu, Liyuan
    Wu, Nanjian
    Science China Information Sciences, 2020, 63 (02)
  • [5] Efficient Hardware Acceleration of Sparsely Active Convolutional Spiking Neural Networks
    Sommer, Jan
    Ozkan, M. Akif
    Keszocze, Oliver
    Teich, Juergen
    IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems, 2022, 41 (11) : 3767 - 3778
  • [6] Evolving Energy Efficient Convolutional Neural Networks
    Young, Steven R.
    Johnston, J. Travis
    Schuman, Catherine D.
    Devineni, Pravallika
    Kay, Bill
    Rose, Derek C.
    Parsa, Maryam
    Patton, Robert M.
    Potok, Thomas E.
    2019 IEEE International Conference on Big Data (Big Data), 2019, : 4479 - 4485
  • [7] Neural Dynamics Pruning for Energy-Efficient Spiking Neural Networks
    Huang, Haoyu
    He, Linxuan
    Liu, Faqiang
    Zhao, Rong
    Shi, Luping
    2024 IEEE International Conference on Multimedia and Expo (ICME), 2024