Geometric Insights into the Multivariate Gaussian Distribution and Its Entropy and Mutual Information

Cited: 10
Authors
Jwo, Dah-Jing [1 ]
Cho, Ta-Shun [2 ]
Biswal, Amita [1 ]
Affiliations
[1] Natl Taiwan Ocean Univ, Dept Commun Nav & Control Engn, 2 Peining Rd, Keelung, Taiwan
[2] Asia Univ, Dept Business Adm, 500 Liufeng Rd, Taichung 41354, Taiwan
Keywords
multivariate Gaussians; correlated random variables; visualization; entropy; relative entropy; mutual information; RANDOM SIGNALS; CHANNEL; FORMULA;
DOI
10.3390/e25081177
Chinese Library Classification
O4 [Physics];
Discipline Code
0702;
Abstract
In this paper, we provide geometric insights, supported by visualization, into the multivariate Gaussian distribution and its entropy and mutual information. To develop these topics, several key methodologies are presented through the discussion, with both technical and statistical illustrations. The paper examines broad structural measures of Gaussian distributions and shows that they can be characterized information-theoretically through the given covariance matrix and the correlated random variables (in terms of relative entropy). The material helps readers grasp the underlying concepts, understand the techniques, and properly implement software for further study of the topic and its applications, and to apply multivariate data sets under the Gaussian distribution. The simulation results also convey the behavior of different elliptical interpretations of the multivariate Gaussian distribution and its entropy in real-world applications, including information coding and nonlinear signal detection. Drawing on relative entropy and mutual information, as well as the analysis of correlated covariances, a wide range of topics is addressed, from basic application concerns to clinical diagnostics for detecting multi-disease effects.
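The two central quantities the abstract refers to have closed forms for Gaussians: the differential entropy of a k-variate Gaussian is H = (1/2) ln((2πe)^k det Σ), and the mutual information of a bivariate Gaussian with correlation ρ is I(X; Y) = −(1/2) ln(1 − ρ²). A minimal sketch of both (the function names are illustrative, not from the paper):

```python
import numpy as np

def gaussian_entropy(cov):
    """Differential entropy (nats) of a k-variate Gaussian
    with covariance matrix `cov`: H = 0.5 * ln((2*pi*e)^k * det(cov))."""
    cov = np.asarray(cov, dtype=float)
    k = cov.shape[0]
    # slogdet is numerically safer than log(det(cov)) for ill-conditioned matrices
    sign, logdet = np.linalg.slogdet(cov)
    return 0.5 * (k * np.log(2.0 * np.pi * np.e) + logdet)

def bivariate_gaussian_mi(rho):
    """Mutual information (nats) between the components of a
    bivariate Gaussian with correlation coefficient `rho`:
    I(X; Y) = -0.5 * ln(1 - rho^2)."""
    return -0.5 * np.log(1.0 - rho ** 2)
```

For the identity covariance in two dimensions this gives H = ln(2πe) ≈ 2.84 nats, and the mutual information vanishes at ρ = 0 and diverges as |ρ| → 1, matching the elliptical picture the paper visualizes (the ellipse degenerates to a line segment as the variables become fully correlated).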
Pages: 24