Statistical mechanics of complex neural systems and high dimensional data

Cited: 43
Authors
Advani, Madhu [1 ]
Lahiri, Subhaneil [1 ]
Ganguli, Surya [1 ]
Affiliation
[1] Stanford Univ, Dept Appl Phys, Stanford, CA 94305 USA
Source
JOURNAL OF STATISTICAL MECHANICS-THEORY AND EXPERIMENT | 2013
Keywords
cavity and replica method; spin glasses (theory); message-passing algorithms; computational neuroscience; randomly asymmetric bonds; storage capacity; information storage; solvable model; spin systems; network; dynamics; representations; perceptron; equations
DOI
10.1088/1742-5468/2013/03/P03014
Chinese Library Classification
O3 [Mechanics]
Subject Classification Code
08; 0801
Abstract
Recent experimental advances in neuroscience have opened new vistas into the immense complexity of neuronal networks. This proliferation of data challenges us on two parallel fronts. First, how can we form adequate theoretical frameworks for understanding how dynamical network processes cooperate across widely disparate spatiotemporal scales to solve important computational problems? Second, how can we extract meaningful models of neuronal systems from high dimensional datasets? To aid in these challenges, we give a pedagogical review of a collection of ideas and theoretical methods arising at the intersection of statistical physics, computer science and neurobiology. We introduce the interrelated replica and cavity methods, which originated in statistical physics as powerful ways to quantitatively analyze large highly heterogeneous systems of many interacting degrees of freedom. We also introduce the closely related notion of message passing in graphical models, which originated in computer science as a distributed algorithm capable of solving large inference and optimization problems involving many coupled variables. We then show how both the statistical physics and computer science perspectives can be applied in a wide diversity of contexts to problems arising in theoretical neuroscience and data analysis. Along the way we discuss spin glasses, learning theory, illusions of structure in noise, random matrices, dimensionality reduction and compressed sensing, all within the unified formalism of the replica method. Moreover, we review recent conceptual connections between message passing in graphical models, and neural computation and learning. Overall, these ideas illustrate how statistical physics and computer science might provide a lens through which we can uncover emergent computational functions buried deep within the dynamical complexities of neuronal networks.
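The abstract describes message passing in graphical models as a distributed algorithm for inference over many coupled variables. As a minimal illustration of the idea (not code from the paper), the sketch below runs sum-product message passing on a hypothetical toy model: a one-dimensional chain of binary spins with fields `h` and couplings `J`, where forward and backward messages yield exact single-site marginals on a tree-structured graph, verified here against brute-force enumeration.

```python
import itertools
import math

def bp_marginals(h, J):
    """Sum-product (belief propagation) marginals for a chain of binary spins.

    Distribution: p(s) ∝ exp(sum_i h[i]*s[i] + sum_i J[i]*s[i]*s[i+1]),
    with s_i in {-1, +1}. Exact on this tree-structured (chain) graph.
    """
    n = len(h)
    spins = (-1, 1)
    # fwd[i][s]: message arriving at site i from the left; bwd[i][s]: from the right
    fwd = [{s: 1.0 for s in spins} for _ in range(n)]
    bwd = [{s: 1.0 for s in spins} for _ in range(n)]
    for i in range(1, n):  # sweep left to right, marginalizing out spin i-1
        for s in spins:
            fwd[i][s] = sum(fwd[i - 1][t] * math.exp(h[i - 1] * t + J[i - 1] * t * s)
                            for t in spins)
    for i in range(n - 2, -1, -1):  # sweep right to left, marginalizing out spin i+1
        for s in spins:
            bwd[i][s] = sum(bwd[i + 1][t] * math.exp(h[i + 1] * t + J[i] * s * t)
                            for t in spins)
    marginals = []
    for i in range(n):
        # belief at site i: local field term times both incoming messages
        b = {s: fwd[i][s] * math.exp(h[i] * s) * bwd[i][s] for s in spins}
        z = b[-1] + b[1]
        marginals.append({s: b[s] / z for s in spins})
    return marginals

def brute_marginals(h, J):
    """Exact marginals by summing over all 2^n spin configurations (for checking)."""
    n = len(h)
    Z = 0.0
    m = [{-1: 0.0, 1: 0.0} for _ in range(n)]
    for s in itertools.product((-1, 1), repeat=n):
        w = math.exp(sum(h[i] * s[i] for i in range(n)) +
                     sum(J[i] * s[i] * s[i + 1] for i in range(n - 1)))
        Z += w
        for i in range(n):
            m[i][s[i]] += w
    return [{k: v / Z for k, v in mi.items()} for mi in m]
```

On loopy graphs the same message updates are iterated to a fixed point and are only approximate; the review's connection to the cavity method concerns exactly this regime.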
Pages: 66
References: 169