Mismatched Estimation and Relative Entropy

Cited by: 66
Authors
Verdu, Sergio [1 ]
Affiliations
[1] Princeton Univ, Dept Elect Engn, Princeton, NJ 08544 USA
Funding
US National Science Foundation;
Keywords
Divergence; free probability; minimum mean-square error (MMSE) estimation; mutual information; relative entropy; Shannon theory; statistics; FREE PROBABILITY-THEORY; FISHER INFORMATION MEASURE; MEAN-SQUARE ERROR; MUTUAL INFORMATION; GAUSSIAN CHANNELS; POWER INEQUALITY; SIMPLE PROOF; ANALOGS; PERTURBATION;
DOI
10.1109/TIT.2010.2050800
Chinese Library Classification (CLC)
TP [automation technology; computer technology];
Discipline Classification Code
0812;
Abstract
A random variable with distribution P is observed in Gaussian noise and is estimated by a mismatched minimum mean-square estimator that assumes the distribution is Q instead of P. This paper shows that the integral over all signal-to-noise ratios (SNRs) of the excess mean-square estimation error incurred by the mismatched estimator is twice the relative entropy D(P || Q) (in nats). This representation of relative entropy can be generalized to non-real-valued random variables, and can be particularized to give new general representations of mutual information in terms of conditional means. Inspired by the new representation, we also propose a definition of free relative entropy, which fills a gap in, and is consistent with, the literature on free probability.
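The identity in the abstract can be checked numerically in a case where everything is in closed form. The sketch below is my own illustrative choice, not taken from the paper's text: the true prior is P = N(0, p), the mismatched prior is Q = N(0, q), and the observation is Y = sqrt(γ)·X + N with N ~ N(0, 1). Under these Gaussian assumptions the conditional-mean estimators are linear, so both mean-square errors and D(P || Q) have explicit formulas, and the integral of the excess error over all SNRs should come out to 2 D(P || Q) nats.

```python
# Numerical sketch of the identity  ∫_0^∞ [mse_{P,Q}(γ) − mmse_P(γ)] dγ = 2 D(P||Q),
# for the Gaussian-prior case (an illustrative choice, not the paper's example).
import math
from scipy.integrate import quad

p, q = 1.0, 2.0  # true and assumed signal variances (hypothetical values)

def mmse_matched(g):
    # MMSE of the matched estimator for X ~ N(0, p) at SNR g.
    return p / (1.0 + p * g)

def mse_mismatched(g):
    # MSE, under the true prior N(0, p), of the estimator
    # q*sqrt(g)*Y / (1 + q*g), i.e. the conditional mean under the
    # wrong prior N(0, q).
    return (p + q * q * g) / (1.0 + q * g) ** 2

excess, _ = quad(lambda g: mse_mismatched(g) - mmse_matched(g), 0, math.inf)

# Relative entropy between the priors, D(N(0,p) || N(0,q)), in nats.
kl = 0.5 * (p / q - 1.0 + math.log(q / p))

print(excess, 2 * kl)  # the two numbers should agree
```

For these values both sides reduce analytically to ln 2 - 1/2, so the quadrature and the closed-form relative entropy can be compared directly.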
Pages: 3712-3720 (9 pages)