A Soft-Pruning Method Applied During Training of Spiking Neural Networks for In-memory Computing Applications

Cited by: 31
Authors
Shi, Yuhan [1 ]
Nguyen, Leon [1 ]
Oh, Sangheon [1 ]
Liu, Xin [1 ]
Kuzum, Duygu [1 ]
Affiliations
[1] Univ Calif San Diego, Elect & Comp Engn Dept, San Diego, CA 92103 USA
Funding
US National Science Foundation;
关键词
spiking neural networks; unsupervised learning; handwriting recognition; pruning; in-memory computing; emerging non-volatile memory; SYNAPSE; NEURONS;
DOI
10.3389/fnins.2019.00405
Chinese Library Classification
Q189 [Neuroscience];
Discipline Code
071006;
Abstract
Inspired by the computational efficiency of the biological brain, spiking neural networks (SNNs) emulate biological neural networks, neural codes, dynamics, and circuitry. SNNs show great potential for implementing unsupervised learning with in-memory computing. Here, we report an algorithmic optimization that improves the energy efficiency of online learning with SNNs on emerging non-volatile memory (eNVM) devices. We develop a pruning method for SNNs that exploits the output firing characteristics of neurons. Unlike previous approaches in the literature, which prune already-trained networks, our method can be applied during network training, preventing unnecessary updates of network parameters. This algorithmic optimization complements the energy efficiency of eNVM technology, which offers a unique in-memory computing platform for parallelizing neural network operations. Our SNN maintains ~90% classification accuracy on the MNIST dataset with up to ~75% pruning, significantly reducing the number of weight updates. The SNN and pruning scheme developed in this work can pave the way toward eNVM-based neuro-inspired systems for energy-efficient online learning in low-power applications.
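The abstract does not give the exact pruning criterion, but the core idea it describes — freezing weight updates for output neurons selected by their firing activity during training — can be illustrated with a minimal sketch. All names here (`soft_prune_mask`, `apply_update`, the random firing counts, the 75% pruning fraction) are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

n_inputs, n_neurons = 784, 100
weights = rng.random((n_inputs, n_neurons))

# Stand-in for output spike counts accumulated over a training interval.
firing_counts = rng.integers(0, 50, size=n_neurons)

def soft_prune_mask(firing_counts, prune_fraction):
    """Hypothetical sketch: mark the least-active output neurons as pruned.

    Neurons whose cumulative firing count falls in the lowest
    `prune_fraction` are frozen, so their incoming weights receive no
    further updates during training.
    """
    k = int(prune_fraction * len(firing_counts))
    pruned = np.argsort(firing_counts)[:k]   # indices of k least-active neurons
    mask = np.ones(len(firing_counts), dtype=bool)
    mask[pruned] = False
    return mask  # True = neuron still trainable

def apply_update(weights, delta, mask):
    """Apply a weight update only to columns (neurons) that are not pruned."""
    weights[:, mask] += delta[:, mask]
    return weights

mask = soft_prune_mask(firing_counts, prune_fraction=0.75)
delta = 0.01 * rng.standard_normal(weights.shape)
before = weights.copy()
weights = apply_update(weights, delta, mask)

# Pruned neurons' weights are untouched; the rest were updated.
print(mask.sum(), "of", n_neurons, "neurons remain trainable")
```

Because pruned columns are simply masked out of the update rather than deleted, this is a "soft" prune: with a 75% pruning fraction, only 25 of the 100 output neurons continue to accumulate weight changes, which is the mechanism by which such a scheme would cut the number of eNVM write operations.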
Pages: 13