Non-Parametric Estimation of Mutual Information through the Entropy of the Linkage

Cited by: 11
Authors
Giraudo, Maria Teresa [1 ]
Sacerdote, Laura [1 ]
Sirovich, Roberta [1 ]
Affiliations
[1] Univ Turin, Dept Math, I-10123 Turin, Italy
Keywords
information measures; mutual information; entropy; copula function; linkage function; kernel method; binless estimator; MULTIVARIATE; DISTRIBUTIONS; NETWORKS;
DOI
10.3390/e15125154
Chinese Library Classification
O4 [Physics];
Discipline Code
0702 ;
Abstract
A new non-parametric and binless estimator of the mutual information of a d-dimensional random vector is proposed. First, an equation is derived that links the mutual information to the entropy of a suitable random vector with uniformly distributed components. When d = 2, this equation reduces to the well-known connection between the mutual information and the entropy of the copula function associated with the original random variables. The problem of estimating the mutual information of the original random vector is thus reduced to estimating the entropy of a random vector obtained through a multidimensional transformation. The proposed estimator is a two-step method: first estimate the transformation and obtain the transformed sample, then estimate its entropy. The properties of the new estimator are discussed through simulation examples, and its performance is compared with that of the best estimators in the literature. Its precision converges to values of the same order of magnitude as those of the best estimator tested. However, the new estimator remains unbiased even for larger dimensions and smaller sample sizes, where the other tested estimators show a bias.
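The two-step scheme the abstract describes can be sketched for the d = 2 case, where the transformed sample is the empirical copula sample. The sketch below is illustrative, not the paper's method: it assumes a rank (empirical-CDF) transform for step one and a Kozachenko-Leonenko 1-nearest-neighbour entropy estimator for step two, and it uses MI = -H(copula), the d = 2 identity mentioned in the abstract.

```python
import math
import random

def rank_transform(xs):
    """Step 1: map a sample to pseudo-uniform values u_i = rank_i / (n + 1)
    via its empirical CDF (an assumed stand-in for the estimated transformation)."""
    n = len(xs)
    order = sorted(range(n), key=lambda i: xs[i])
    u = [0.0] * n
    for rank, i in enumerate(order, start=1):
        u[i] = rank / (n + 1)
    return u

def knn_entropy_2d(pts):
    """Step 2: Kozachenko-Leonenko 1-nearest-neighbour entropy estimate (nats)
    for a 2-D sample, using a brute-force O(n^2) neighbour search."""
    n = len(pts)
    euler_gamma = 0.5772156649015329  # Euler-Mascheroni constant
    log_r_sum = 0.0
    for i, (xi, yi) in enumerate(pts):
        best = math.inf
        for j, (xj, yj) in enumerate(pts):
            if i != j:
                d2 = (xi - xj) ** 2 + (yi - yj) ** 2
                if d2 < best:
                    best = d2
        log_r_sum += 0.5 * math.log(best)  # log of the 1-NN Euclidean distance
    # H_hat = d * mean(log r_i) + log(volume of unit d-ball) + gamma + log(n - 1),
    # with d = 2 so the unit-ball volume is pi.
    return 2.0 * log_r_sum / n + math.log(math.pi) + euler_gamma + math.log(n - 1)

def mutual_information(xs, ys):
    """MI estimate as minus the entropy of the empirical copula sample (d = 2)."""
    pts = list(zip(rank_transform(xs), rank_transform(ys)))
    return -knn_entropy_2d(pts)

# Check on a bivariate Gaussian, where the true MI is -0.5 * ln(1 - rho^2).
random.seed(0)
n, rho = 1000, 0.8
x = [random.gauss(0.0, 1.0) for _ in range(n)]
y = [rho * x[i] + math.sqrt(1 - rho * rho) * random.gauss(0.0, 1.0)
     for i in range(n)]
z = [random.gauss(0.0, 1.0) for _ in range(n)]  # independent of x

mi_dep = mutual_information(x, y)    # true value ~ 0.511 nats
mi_indep = mutual_information(x, z)  # true value 0
print(f"MI(x, y) ~ {mi_dep:.3f}, MI(x, z) ~ {mi_indep:.3f}")
```

Because the rank transform makes both margins exactly uniform, the entropy estimate is taken on a sample confined to the unit square; the 1-NN estimator carries some boundary bias there, which is one reason the paper compares several entropy estimators rather than fixing one.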
Pages: 5154-5177
Page count: 24