Effects of synaptic connectivity on liquid state machine performance

Cited by: 36
Authors
Ju, Han [1 ,3 ]
Xu, Jian-Xin [2 ]
Chong, Edmund [1 ]
VanDongen, Antonius M. J. [1 ]
Affiliations
[1] Duke NUS Grad Med Sch, Program Neurosci & Behav Disorders, Singapore, Singapore
[2] Natl Univ Singapore, Dept Elect & Comp Engn, Singapore 117548, Singapore
[3] Natl Univ Singapore, Grad Sch Integrat Sci & Engn, Singapore 117548, Singapore
Keywords
Liquid state machine; Genetic algorithm; Neural microcircuit optimization; Spatiotemporal pattern classification; NEURAL-NETWORKS; SYNAPSES; PATTERNS; BRAIN;
DOI
10.1016/j.neunet.2012.11.003
Chinese Library Classification (CLC) code
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
The Liquid State Machine (LSM) is a biologically plausible computational neural network model for real-time computing on time-varying inputs, whose structure and function were inspired by the properties of neocortical columns in the central nervous system of mammals. The LSM uses spiking neurons connected by dynamic synapses to project inputs into a high dimensional feature space, allowing classification of inputs by linear separation, similar to the approach used in support vector machines (SVMs). The performance of an LSM neural network model on pattern recognition tasks mainly depends on its parameter settings. Two parameters are of particular interest: the distribution of synaptic strengths and synaptic connectivity. To design an efficient liquid filter that performs desired kernel functions, these parameters need to be optimized. We have studied performance as a function of these parameters for several models of synaptic connectivity. The results show that in order to achieve good performance, large synaptic weights are required to compensate for a small number of synapses in the liquid filter, and vice versa. In addition, a larger variance of the synaptic weights results in better performance for LSM benchmark problems. We also propose a genetic algorithm-based approach to evolve the liquid filter from a minimum structure with no connections to an optimized kernel with a minimal number of synapses and high classification accuracy. This approach facilitates the design of an optimal LSM with reduced computational complexity. Results obtained using this genetic programming approach show that the synaptic weight distribution after evolution is similar in shape to that found in cortical circuitry. © 2012 Elsevier Ltd. All rights reserved.
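To make the LSM architecture described in the abstract concrete, the sketch below (Python, illustrative only, not the authors' implementation) builds a small randomly connected "liquid" of leaky integrate-and-fire units, drives it with a spatiotemporal spike pattern, and trains a linear readout on the resulting high-dimensional state, mirroring the linear-separation step noted above. All sizes, the connection probability p_conn, the weight scale w_scale, and the toy early-burst versus late-burst task are assumptions chosen for illustration; the paper's dynamic synapses and its genetic-algorithm optimization of connectivity are not reproduced here.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions, not values from the paper)
N = 100          # liquid (reservoir) neurons
T = 200          # time steps per input pattern
p_conn = 0.1     # sparse random recurrent connectivity
w_scale = 0.3    # synaptic weight scale; the paper studies the trade-off
                 # between the number of synapses and weight magnitude

# Sparse random recurrent weights and input projection
W = (rng.random((N, N)) < p_conn) * rng.normal(0.0, w_scale, (N, N))
W_in = rng.normal(0.0, 1.0, N)

def liquid_state(spike_train):
    # Run a leaky integrate-and-fire liquid on a binary input spike train
    # and return a low-pass filtered spike trace as the state vector.
    v = np.zeros(N)                   # membrane potentials
    spikes = np.zeros(N, dtype=bool)  # spikes from the previous step
    trace = np.zeros(N)               # state seen by the readout
    for t in range(T):
        v = 0.9 * v + W_in * spike_train[t] + W @ spikes
        spikes = v > 1.0
        v[spikes] = 0.0               # reset after a spike
        trace = 0.95 * trace + spikes
    return trace

def make_pattern(label):
    # Toy spatiotemporal classes: an early burst vs. a late burst of spikes.
    s = np.zeros(T)
    start = 20 if label == 0 else 120
    s[start:start + 30] = rng.random(30) < 0.5
    return s

labels = rng.integers(0, 2, size=100)
states = np.array([liquid_state(make_pattern(lab)) for lab in labels])

# Linear readout trained by least squares: classification by linear
# separation in the liquid's high-dimensional state space.
A = np.c_[states, np.ones(len(states))]
w_out, *_ = np.linalg.lstsq(A, 2.0 * labels - 1.0, rcond=None)
pred = (A @ w_out) > 0
print("training accuracy:", (pred == labels.astype(bool)).mean())

The least-squares readout here simply stands in for any linear classifier applied to the liquid state; the choice of readout is secondary to the point the abstract makes, namely that the liquid's connectivity and weight distribution determine how separable those states are.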
Pages: 39-51
Number of pages: 13