An ensemble unsupervised spiking neural network for objective recognition

Cited by: 24
Authors
Fu, Qiang [1 ]
Dong, Hongbin [1 ]
Affiliations
[1] Harbin Engn Univ, Coll Comp Sci & Technol, Harbin 150001, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Spiking neural network; STDP; Unsupervised training; Ensemble learning; Transfer learning;
DOI
10.1016/j.neucom.2020.07.109
CLC number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The spiking neuron is the basic unit of spiking neural networks (SNNs). Spiking neurons model nervous cells by receiving external stimuli, generating action potentials, and firing spikes. SNNs are typically used for pattern recognition tasks or complex computation that depends on their brain-like characteristics. Although SNNs have no advantage over deep neural networks in terms of classification accuracy, they share more characteristics with biological neurons. In this paper, a hierarchical SNN comprising convolutional and pooling layers is designed. The proposed SNN consists of excitatory and inhibitory neurons, following the mechanism of the primate brain. A temporal (rank-order) coding scheme is used to encode the input patterns: the rank at which spikes arrive at post-synapses establishes the priority of input spikes for a particular pattern. The spike-timing-dependent plasticity (STDP) learning rule is used in the convolutional layers to extract visual features in an unsupervised manner. During the classification stage, a lateral inhibition mechanism is used to suppress non-firing neurons and produce distinguishable results. To improve the performance of the SNN, an ensemble SNN architecture using the voting method is proposed, and transfer learning is used to avoid re-training the SNN when solving different tasks. Classification tasks on the MNIST, CIFAR-10, and BreaKHis databases are used to verify the performance of the proposed SNN. Experimental results show that, using the ensemble architecture and transfer learning, the classification accuracy reaches 99.27% on the MNIST database, with an overall accuracy of 93% on the CIFAR-10 database and 96.97% on the BreaKHis database. The proposed approach also outperforms the benchmarking approaches.
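The rank-order encoding and the STDP rule described above can be sketched in a few lines. This is a minimal illustrative sketch, not the authors' implementation: the parameter values (`a_plus`, `a_minus`, `tau`) and the exact pair-based update form are assumed defaults.

```python
import numpy as np

def rank_order_encode(image):
    """Rank-order (temporal) coding: more intense pixels fire earlier.

    Each pixel is assigned a spike time equal to its intensity rank,
    so only the order of intensities is preserved (a simplified reading
    of the rank-order scheme described in the abstract).
    """
    flat = image.ravel()
    order = np.argsort(-flat)            # indices from brightest to darkest
    times = np.empty_like(order)
    times[order] = np.arange(flat.size)  # rank of each pixel = its spike time
    return times.reshape(image.shape)

def stdp_update(w, pre_t, post_t, a_plus=0.01, a_minus=0.012,
                tau=20.0, w_min=0.0, w_max=1.0):
    """Pair-based STDP: potentiate if pre fires before post, else depress."""
    dt = post_t - pre_t
    if dt >= 0:
        w += a_plus * np.exp(-dt / tau)   # causal pair -> potentiation (LTP)
    else:
        w -= a_minus * np.exp(dt / tau)   # anti-causal pair -> depression (LTD)
    return float(np.clip(w, w_min, w_max))
```

In an unsupervised convolutional layer of this kind, `stdp_update` would be applied to each synapse whose pre- and post-synaptic neurons both fired, with the weight bounds keeping the synapse stable.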
Taken together, these results suggest that the ensemble SNN architecture with transfer learning is key to improving the performance of the SNN. (c) 2020 Elsevier B.V. All rights reserved.
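The voting step of the ensemble architecture can be sketched as a simple majority vote over the member networks' predicted labels. `ensemble_vote` and its tie-breaking rule are illustrative assumptions, not the paper's exact procedure.

```python
from collections import Counter

def ensemble_vote(predictions):
    """Majority vote across ensemble members' predicted labels.

    predictions: list of per-member label lists, one label per sample.
    Ties are broken in favor of the label seen first among the members
    (an assumption; the paper's tie-breaking rule is not given here).
    """
    n_samples = len(predictions[0])
    voted = []
    for i in range(n_samples):
        votes = [member[i] for member in predictions]
        # Counter.most_common keeps first-seen order for equal counts
        voted.append(Counter(votes).most_common(1)[0][0])
    return voted
```

With three member SNNs predicting `[1, 2]`, `[1, 3]`, and `[0, 3]` on two samples, the vote yields `[1, 3]`.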
Pages: 47-58
Page count: 12