On MMSE Crossing Properties and Implications in Parallel Vector Gaussian Channels

Cited by: 25
Authors
Bustin, Ronit [1 ]
Payaro, Miquel [2 ]
Palomar, Daniel P. [3 ]
Shamai, Shlomo [1]
Affiliations
[1] Technion Israel Inst Technol, Dept Elect Engn, IL-32000 Haifa, Israel
[2] Ctr Tecnol Telecomunicac Catalunya, Engn Unit, Barcelona 08860, Spain
[3] Hong Kong Univ Sci & Technol, Dept Elect & Comp Engn, Kowloon, Hong Kong, Peoples R China
Funding
Israel Science Foundation
Keywords
Entropy power inequality (EPI); Gaussian broadcast channel; Gaussian compound broadcast channel; Gaussian noise; I-MMSE; minimum mean square error (MMSE); multiple-input multiple-output (MIMO); mutual information; parallel vector channel; single crossing point; capacity region
DOI
10.1109/TIT.2012.2225405
CLC Number
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
The scalar additive Gaussian noise channel has the "single crossing point" property between the minimum mean square error (MMSE) in the estimation of the input given the channel output, assuming a Gaussian input to the channel, and the MMSE assuming an arbitrary input. This paper extends the result to the parallel vector additive Gaussian channel in three phases. 1) The channel matrix is the identity matrix, and the Gaussian input is restricted to a vector of i.i.d. Gaussian elements; the "single crossing point" property holds with respect to the signal-to-noise ratio, as in the scalar case. 2) The channel matrix is arbitrary, and the Gaussian input is restricted to a vector of independent Gaussian elements; a "single crossing point" property is derived for each diagonal element of the MMSE matrix. 3) The Gaussian input is allowed to be an arbitrary Gaussian random vector; a "single crossing point" property is derived for each eigenvalue of the difference between the two MMSE matrices. These three extensions are then translated into new information-theoretic properties of the mutual information via the I-MMSE relationship, a fundamental connection between estimation theory and information theory revealed by Guo and coworkers. The results of the last phase are also translated into a new property of Fisher information. Finally, the applicability of all three extensions to information-theoretic problems is demonstrated through a proof of a special case of Shannon's vector entropy power inequality, a converse proof of the capacity region of the parallel degraded broadcast channel (BC) under per-antenna input power constraints and under an input covariance constraint, and a converse proof of the capacity region of the compound parallel degraded BC under an input covariance constraint.
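
To make the scalar "single crossing point" property concrete, here is a minimal numerical sketch (not part of the paper; the BPSK input, the variance values, and the quadrature order are illustrative assumptions). Over the scalar channel Y = sqrt(snr)*X + N with N ~ N(0,1), a Gaussian input of variance s2 has MMSE s2/(1 + s2*snr), and by the I-MMSE relationship the derivative of the mutual information in snr equals half the MMSE; the sketch checks numerically that this Gaussian MMSE curve crosses the MMSE of a unit-variance BPSK input at most once in snr.

import numpy as np

def mmse_bpsk(snr, n_nodes=100):
    """MMSE of X = +/-1 (unit variance) in Y = sqrt(snr)*X + N, N ~ N(0,1).

    Uses the conditional mean E[X|Y=y] = tanh(sqrt(snr)*y), which gives
    mmse = 1 - E[tanh^2(snr + sqrt(snr)*N)]; the expectation over N is
    computed with probabilists' Gauss-Hermite quadrature.
    """
    nodes, weights = np.polynomial.hermite_e.hermegauss(n_nodes)
    t = np.tanh(snr + np.sqrt(snr) * nodes)
    return 1.0 - np.dot(weights, t ** 2) / np.sqrt(2.0 * np.pi)

def mmse_gauss(snr, s2):
    """MMSE of a Gaussian input with variance s2 over the same channel."""
    return s2 / (1.0 + s2 * snr)

# With Gaussian variance 0.5 < Var(X) = 1, the difference starts negative;
# since the BPSK MMSE decays exponentially while the Gaussian MMSE decays
# only like 1/snr, it ends positive: one sign change, as the property states.
snrs = np.linspace(1e-3, 10.0, 2000)
diff = np.array([mmse_gauss(s, 0.5) - mmse_bpsk(s) for s in snrs])
print("sign changes:", np.count_nonzero(np.diff(np.sign(diff))))  # expect 1
print("crossing near snr =", snrs[np.argmin(np.abs(diff))])

Choosing the Gaussian variance below the input variance is what forces an actual crossing; with equal variances the Gaussian MMSE dominates everywhere and the difference never changes sign.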
Pages: 818-844
Page count: 27
References (32 in total)
[1] T. M. Cover and J. A. Thomas, Elements of Information Theory, 2nd ed., 2006.
[2] T. M. Apostol, Calculus, Vol. II: Multi-Variable Calculus and Linear Algebra, 2nd ed., 1969.
[3] R. Atar and T. Weissman, "Mutual Information, Relative Entropy, and Estimation in the Poisson Channel," IEEE Transactions on Information Theory, vol. 58, no. 3, pp. 1302-1318, 2012.
[4] P. P. Bergmans, "A Simple Converse for Broadcast Channels With Additive White Gaussian Noise," IEEE Transactions on Information Theory, vol. 20, no. 2, pp. 279-280, 1974.
[5] R. Bhatia, Positive Definite Matrices, Princeton Series in Applied Mathematics, 2007.
[6] T. M. Cover, "Comments on Broadcast Channels," IEEE Transactions on Information Theory, vol. 44, no. 6, pp. 2524-2530, 1998.
[7] T. E. Duncan, "On the Calculation of Mutual Information," SIAM Journal on Applied Mathematics, vol. 19, no. 1, pp. 215-220, 1970.
[8] E. Ekrem and S. Ulukus, "The Secrecy Capacity Region of the Gaussian MIMO Multi-Receiver Wiretap Channel," IEEE Transactions on Information Theory, vol. 57, no. 4, pp. 2083-2114, 2011.
[9] Y. Geng, in Proc. IEEE International Symposium on Information Theory (ISIT), 2012, p. 586, DOI: 10.1109/ISIT.2012.6284259.
[10] D. Guo, in Proc. IEEE International Symposium on Information Theory (ISIT), 2005.