Geometric Insights into the Multivariate Gaussian Distribution and Its Entropy and Mutual Information

Cited: 10
Authors
Jwo, Dah-Jing [1 ]
Cho, Ta-Shun [2 ]
Biswal, Amita [1 ]
Affiliations
[1] Natl Taiwan Ocean Univ, Dept Commun Nav & Control Engn, 2 Peining Rd, Keelung, Taiwan
[2] Asia Univ, Dept Business Adm, 500 Liufeng Rd, Taichung 41354, Taiwan
Keywords
multivariate Gaussians; correlated random variables; visualization; entropy; relative entropy; mutual information; RANDOM SIGNALS; CHANNEL; FORMULA
DOI
10.3390/e25081177
Chinese Library Classification
O4 [Physics]
Subject Classification
0702
Abstract
In this paper, we provide geometric insights, supported by visualization, into the multivariate Gaussian distribution and its entropy and mutual information. Several methodologies for developing the entropy and mutual information of the multivariate Gaussian distribution are presented through the discussion and illustrated both technically and statistically. The paper examines broad structural measures of Gaussian distributions, showing that they can be described information-theoretically, via the relative entropy, in terms of the given covariance matrix and the correlated random variables. The material helps readers grasp the fundamental concepts, understand the techniques, and implement software for further study of the science and applications of the topic. The simulation results also illustrate the behavior of different elliptical interpretations of the multivariate Gaussian distribution and its entropy for real-world applications, including information coding and nonlinear signal detection. By combining relative entropy and mutual information with the analysis of correlated covariances, a wide range of uses is addressed, from basic application concerns to clinical diagnostics for detecting multi-disease effects.
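To make the quantities named in the abstract concrete, the sketch below computes the differential entropy of a k-variate Gaussian from its covariance matrix, H(X) = 0.5 ln((2*pi*e)^k det(Sigma)), and the mutual information between two jointly Gaussian blocks via I(X; Y) = H(X) + H(Y) - H(X, Y). This is a minimal illustration of the standard closed-form expressions, not code from the paper itself; the function names gaussian_entropy and gaussian_mutual_information are our own.

    import numpy as np

    # Differential entropy of a k-variate Gaussian N(mu, Sigma), in nats:
    #   H(X) = 0.5 * ln((2*pi*e)^k * det(Sigma))
    def gaussian_entropy(cov):
        cov = np.asarray(cov, dtype=float)
        k = cov.shape[0]
        _, logdet = np.linalg.slogdet(cov)  # stable log-determinant
        return 0.5 * (k * np.log(2.0 * np.pi * np.e) + logdet)

    # Mutual information between the first dim_x coordinates and the rest:
    #   I(X; Y) = H(X) + H(Y) - H(X, Y)
    def gaussian_mutual_information(cov, dim_x):
        cov = np.asarray(cov, dtype=float)
        h_x = gaussian_entropy(cov[:dim_x, :dim_x])
        h_y = gaussian_entropy(cov[dim_x:, dim_x:])
        return h_x + h_y - gaussian_entropy(cov)

    # Bivariate example with correlation rho; the closed form is
    #   I(X; Y) = -0.5 * ln(1 - rho**2).
    rho = 0.8
    cov = np.array([[1.0, rho], [rho, 1.0]])
    print(gaussian_entropy(cov))                # joint entropy, about 2.3270 nats
    print(gaussian_mutual_information(cov, 1))  # about 0.5108 nats

For rho = 0.8 the mutual information -0.5 ln(1 - 0.64) = 0.5108 nats, matching the block-decomposition computation above; the same functions apply unchanged to covariance matrices of any dimension.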
Pages: 24
Related Papers
50 records in total
[31] Li, Junying; Ren, Weijie; Han, Min. Mutual Information Variational Autoencoders and Its Application to Feature Extraction of Multivariate Time Series. International Journal of Pattern Recognition and Artificial Intelligence, 2022, 36(06).
[32] Misra, N.; Singh, H.; Demchuk, E. Estimation of the entropy of a multivariate normal distribution. Journal of Multivariate Analysis, 2005, 92(02): 324-342.
[33] Blumentritt, Thomas; Schmid, Friedrich. Mutual information as a measure of multivariate association: analytical properties and statistical estimation. Journal of Statistical Computation and Simulation, 2012, 82(09): 1257-1274.
[34] Zakai, M. On mutual information, likelihood ratios, and estimation error for the additive Gaussian channel. IEEE Transactions on Information Theory, 2005, 51(09): 3017-3024.
[35] Rowe, Taylor; Day, Troy. The Sampling Distribution of the Total Correlation for Multivariate Gaussian Random Variables. Entropy, 2019, 21(10).
[36] Kireeva, Anastasia; Mourrat, Jean-Christophe. Breakdown of a concavity property of mutual information for non-Gaussian channels. Information and Inference: A Journal of the IMA, 2024, 13(02).
[37] Gibson, Jerry. Entropy Power, Autoregressive Models, and Mutual Information. Entropy, 2018, 20(10).
[38] Giraudo, Maria Teresa; Sacerdote, Laura; Sirovich, Roberta. Non-Parametric Estimation of Mutual Information through the Entropy of the Linkage. Entropy, 2013, 15(12): 5154-5177.
[39] Li, Wenjie; Li, Yao. Entropy, mutual information, and systematic measures of structured spiking neural networks. Journal of Theoretical Biology, 2020, 501.
[40] Zhai, Yuan; Yang, Bo; Xi, Zhengjun. Belavkin-Staszewski Relative Entropy, Conditional Entropy, and Mutual Information. Entropy, 2022, 24(06).