Sparsity Enables Data and Energy Efficient Spiking Convolutional Neural Networks

Cited: 5
Authors
Bhatt, Varun [1 ]
Ganguly, Udayan [1 ]
Affiliation
[1] Indian Inst Technol, Dept Elect Engn, Mumbai, Maharashtra, India
Source
ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2018, PT I | 2018 / Vol. 11139
Keywords
Sparse coding; Unsupervised learning; Feature extraction; Spiking neural networks; Training data efficiency; NEURONS;
DOI
10.1007/978-3-030-01418-6_26
CLC classification
TP18 [Artificial intelligence theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
In recent years, deep learning has surpassed human performance on image recognition tasks. A major issue with deep learning systems is their reliance on large datasets for optimal performance. When presented with a new task, the ability to generalize from small amounts of data becomes highly attractive. Research has shown that the human visual cortex may employ sparse coding to extract features from the images we see, leading to efficient use of the available data. To ensure good generalization and energy efficiency, we create a multi-layer spiking convolutional neural network that performs layer-wise sparse coding for unsupervised feature extraction. Applied to the MNIST dataset, it achieves 92.3% accuracy with just 500 data samples, 4x fewer than vanilla CNNs need for similar accuracy, while reaching 98.1% accuracy with the full dataset. Only around 7000 spikes are used per image (a 6x reduction in transferred bits per forward pass compared to CNNs), implying high sparsity. Thus, we show that our algorithm ensures better sparsity, leading to improved data and energy efficiency in learning, which is essential for some real-world applications.
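The sparse coding the abstract describes can be illustrated with the Locally Competitive Algorithm (LCA) of Rozell et al., one of the works this record cites. The sketch below is a minimal non-spiking NumPy version for a single input vector over a fixed dictionary; the function name `lca_sparse_code` and all parameter values are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def lca_sparse_code(x, D, lam=0.1, dt=0.1, steps=300):
    """Sparse code for input x over dictionary D (unit-norm columns)
    via Locally Competitive Algorithm dynamics."""
    G = D.T @ D - np.eye(D.shape[1])   # lateral inhibition: Gram matrix minus self-connections
    b = D.T @ x                        # feed-forward drive from the input
    u = np.zeros(D.shape[1])           # internal "membrane potential" of each unit

    def soft_threshold(u):
        # Units below threshold stay silent, enforcing sparsity.
        return np.where(np.abs(u) > lam, u - lam * np.sign(u), 0.0)

    for _ in range(steps):
        a = soft_threshold(u)          # sparse activations (spiking analog: firing units)
        u += dt * (b - u - G @ a)      # leaky integration with competitive inhibition
    return soft_threshold(u)
```

In the spiking variant of such schemes, the thresholded activations correspond to neurons emitting spikes, so higher sparsity directly means fewer spikes transferred per image.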
Pages: 263-272
Page count: 10
References
13 items
[1] Anonymous. CoRR, abs/1611.01421.
[2] Bi GQ, Poo MM. Synaptic modifications in cultured hippocampal neurons: Dependence on spike timing, synaptic strength, and postsynaptic cell type. Journal of Neuroscience, 1998, 18(24): 10464-10472.
[3] Diehl PU, Cook M. Unsupervised learning of digit recognition using spike-timing-dependent plasticity. Frontiers in Computational Neuroscience, 2015, 9.
[4] Ferre P, Mamalet F, Thorpe SJ. Unsupervised feature learning with winner-takes-all based STDP. Frontiers in Computational Neuroscience, 2018, 12.
[5] LeCun Y, Bengio Y, Hinton G. Deep learning. Nature, 2015, 521(7553): 436-444.
[6] Maass W. Networks of spiking neurons: The third generation of neural network models. Neural Networks, 1997, 10(9): 1659-1671.
[7] Olshausen BA, Field DJ. Sparse coding with an overcomplete basis set: A strategy employed by V1? Vision Research, 1997, 37(23): 3311-3325.
[8] Panda P. CoRR, abs/1602.01510, 2016.
[9] Rozell C. 2007 IEEE International Conference on Image Processing, Vol. 4, 2007.
[10] Tang PTP. CoRR, abs/1705.05475, 2017.