Distributed Bayesian Computation and Self-Organized Learning in Sheets of Spiking Neurons with Local Lateral Inhibition

Cited by: 10
Authors
Bill, Johannes [1]
Buesing, Lars [2]
Habenschuss, Stefan [1]
Nessler, Bernhard [3]
Maass, Wolfgang [1]
Legenstein, Robert [1]
Affiliations
[1] Graz Univ Technol, Inst Theoret Comp Sci, Graz, Austria
[2] Columbia Univ, Dept Stat, New York, NY USA
[3] Frankfurt Inst Adv Studies, Frankfurt, Germany
Funding
Austrian Science Fund
Keywords
EXCITATORY NEURONS; VISUAL-CORTEX; CONNECTIONS; MODEL; ORIENTATION; SPECIFICITY; INTEGRATION; PLASTICITY; NETWORKS; DYNAMICS;
DOI
10.1371/journal.pone.0134356
Chinese Library Classification (CLC)
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences]
Subject classification codes
07; 0710; 09
Abstract
During the last decade, Bayesian probability theory has emerged as a framework in cognitive science and neuroscience for describing perception, reasoning and learning in mammals. However, our understanding of how probabilistic computations could be organized in the brain, and how the observed connectivity structure of cortical microcircuits supports these computations, is rudimentary at best. In this study, we investigate statistical inference and self-organized learning in a spatially extended spiking network model that accommodates both local competitive and large-scale associative aspects of neural information processing, under a unified Bayesian account. Specifically, we show how the spiking dynamics of a recurrent network with lateral excitation and local inhibition, in response to distributed spiking input, can be understood as sampling from a variational posterior distribution of a well-defined implicit probabilistic model. This interpretation further permits a rigorous analytical treatment of experience-dependent plasticity at the network level. Using machine learning theory, we derive update rules for neuron and synapse parameters that correspond to Hebbian synaptic and homeostatic intrinsic plasticity rules in a neural implementation. In computer simulations, we demonstrate that the interplay of these plasticity rules leads to the emergence of probabilistic local experts that form distributed assemblies of similarly tuned cells communicating through lateral excitatory connections. The resulting sparse distributed spike code of a well-adapted network carries compressed information on salient input features combined with prior experience about correlations among them. Our theory predicts that the emergence of such efficient representations benefits from network architectures in which the range of local inhibition matches the spatial extent of pyramidal cells that share common afferent input.
Pages: 51
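
To make the ingredients named in the abstract more concrete, the following Python sketch gives a minimal, heavily simplified illustration, not the authors' implementation: a small sheet of stochastic neurons receives a shared binary input, neurons compete through a soft winner-take-all within a local inhibition radius, afferent weights follow a Hebbian update, and an intrinsic bias implements a homeostatic drive toward a target firing rate. The grid size, inhibition radius R_INH, learning rates, and TARGET_RATE below are illustrative assumptions and are not taken from the paper.

# Minimal sketch (not the authors' implementation): a sheet of stochastic neurons
# with local lateral inhibition modeled as a soft winner-take-all over each
# neuron's neighborhood, a Hebbian update of afferent weights, and a homeostatic
# intrinsic bias. GRID, R_INH, learning rates, and TARGET_RATE are assumptions.
import numpy as np

rng = np.random.default_rng(0)

N_IN, GRID = 40, 8          # input channels; the sheet is GRID x GRID neurons
R_INH = 2                   # radius of local lateral inhibition (illustrative)
ETA_W, ETA_B = 0.01, 0.01   # Hebbian and homeostatic learning rates (illustrative)
TARGET_RATE = 1.0 / (2 * R_INH + 1) ** 2   # roughly one spike per neighborhood

W = rng.normal(0.0, 0.1, size=(GRID, GRID, N_IN))   # afferent weights of each neuron
b = np.zeros((GRID, GRID))                          # intrinsic excitabilities (biases)

def step(x, W, b):
    """One discrete time step: local soft-WTA spiking, then plasticity (in place)."""
    u = W @ x + b                          # membrane potentials, shape (GRID, GRID)
    z = np.zeros((GRID, GRID))             # binary spike output of the sheet
    for i in range(GRID):
        for j in range(GRID):
            # competition is restricted to the local inhibition neighborhood
            i0, i1 = max(0, i - R_INH), min(GRID, i + R_INH + 1)
            j0, j1 = max(0, j - R_INH), min(GRID, j + R_INH + 1)
            local_u = u[i0:i1, j0:j1].ravel()
            p = np.exp(local_u - local_u.max())
            p /= p.sum()                   # softmax over the neighborhood
            # the neuron fires with the probability the softmax assigns to it
            if rng.random() < p[(i - i0) * (j1 - j0) + (j - j0)]:
                z[i, j] = 1.0
    # Hebbian update: neurons that fired pull their weights toward the input
    W += ETA_W * z[:, :, None] * (x[None, None, :] - W)
    # homeostatic intrinsic plasticity: drive each neuron toward the target rate
    b += ETA_B * (TARGET_RATE - z)
    return z

# drive the sheet with random binary input patterns ("distributed spiking input")
for _ in range(200):
    x = (rng.random(N_IN) < 0.2).astype(float)
    step(x, W, b)

In the paper itself, the network dynamics are derived so that the spikes form samples from a variational posterior of an implicit probabilistic model, and the plasticity rules follow from a machine-learning objective; the softmax competition and the specific weight and bias updates above are only stand-ins for that derivation.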