Additivity of Information in Multilayer Networks via Additive Gaussian Noise Transforms

Cited by: 0
Authors
Reeves, Galen [1 ,2 ]
Affiliations
[1] Duke Univ, Dept ECE, Durham, NC 27706 USA
[2] Duke Univ, Dept Stat Sci, Durham, NC 27706 USA
Source
2017 55th Annual Allerton Conference on Communication, Control, and Computing (Allerton) | 2017
Keywords
Mean-square error; Mutual information; Random matrices; CDMA; Recovery; Channels; Capacity
DOI
Not available
Chinese Library Classification
TP [Automation and computer technology]
Subject Classification Code
0812
Abstract
Multilayer (or deep) networks are powerful probabilistic models based on multiple stages of a linear transform followed by a non-linear (possibly random) function. In general, the linear transforms are defined by matrices and the non-linear functions are defined by information channels. These models have gained great popularity due to their ability to characterize complex probabilistic relationships arising in a wide variety of inference problems. The contribution of this paper is a new method for analyzing the fundamental limits of statistical inference in settings where the model is known. The validity of our method can be established in a number of settings and is conjectured to hold more generally. A key assumption made throughout is that the matrices are drawn randomly from orthogonally invariant distributions. Our method yields explicit formulas for 1) the mutual information; 2) the minimum mean-squared error (MMSE); 3) the existence and locations of certain phase transitions with respect to the problem parameters; and 4) the stationary points for the state evolution of approximate message passing algorithms. When applied to the special case of models with multivariate Gaussian channels, our method is rigorous and has close connections to free probability theory for random matrices. When applied to the general case of non-Gaussian channels, our method provides a simple alternative to the replica method from statistical physics. A key observation is that the combined effects of the individual components in the model (namely the matrices and the channels) are additive when viewed in a certain transform domain.
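To make the model class concrete, the following is a minimal generative sketch, not taken from the paper: the i.i.d. Gaussian signal prior, the ReLU-plus-Gaussian-noise channels, the square layer dimensions, and the specific Haar-orthogonal construction of the mixing matrices are all illustrative assumptions. The abstract only requires that each matrix be drawn from an orthogonally invariant distribution and that each non-linearity be an information channel.

import numpy as np

rng = np.random.default_rng(0)

def haar_orthogonal(n):
    # QR decomposition of an i.i.d. Gaussian matrix, with the signs of diag(R) fixed,
    # yields a Haar-distributed orthogonal matrix, whose law is orthogonally invariant.
    q, r = np.linalg.qr(rng.standard_normal((n, n)))
    return q * np.sign(np.diag(r))

def noisy_relu_channel(z, noise_std=0.1):
    # Illustrative non-linear random channel: ReLU followed by additive Gaussian noise.
    return np.maximum(z, 0.0) + noise_std * rng.standard_normal(z.shape)

def sample_multilayer(n=100, num_layers=3):
    # One draw from an L-layer model: x_0 ~ prior; x_l = channel(A_l x_{l-1}), l = 1..L.
    x = rng.standard_normal(n)            # illustrative i.i.d. Gaussian prior on the signal
    for _ in range(num_layers):
        A = haar_orthogonal(n)            # layer-l linear transform (orthogonally invariant)
        x = noisy_relu_channel(A @ x)     # layer-l non-linear (random) channel
    return x

y = sample_multilayer()                   # observation at the output of the final layer

An inference question for such a model would be, for example, the MMSE of the input signal given the final-layer observation y; quantities of this type, in the large-system limit, are what the paper's method is intended to characterize.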
Pages: 1064-1070
Page count: 7