Geometric Insights into the Multivariate Gaussian Distribution and Its Entropy and Mutual Information

Cited: 8
Authors
Jwo, Dah-Jing [1 ]
Cho, Ta-Shun [2 ]
Biswal, Amita [1 ]
Affiliations
[1] Natl Taiwan Ocean Univ, Dept Commun Nav & Control Engn, 2 Peining Rd, Keelung, Taiwan
[2] Asia Univ, Dept Business Adm, 500 Liufeng Rd, Taichung 41354, Taiwan
Keywords
multivariate Gaussians; correlated random variables; visualization; entropy; relative entropy; mutual information; random signals; channel; formula
DOI
10.3390/e25081177
Chinese Library Classification
O4 [Physics]
Subject Classification Code
0702
Abstract
In this paper, we provide geometric insights, with visualization, into the multivariate Gaussian distribution and its entropy and mutual information. Several methodologies for developing the multivariate Gaussian distribution together with its entropy and mutual information are presented and discussed, supported by technical and statistical illustrations. The paper examines broad structural measures of Gaussian distributions and shows that these can be described information-theoretically, relating the given covariance matrix to the correlated random variables in terms of relative entropy. The material is intended to help readers grasp the underlying concepts, understand the techniques, and implement software for further study of the theory and its applications, and to support the analysis of multivariate data sets under the Gaussian distribution. The simulation results also illustrate the behavior of different elliptical interpretations of the multivariate Gaussian distribution and its entropy for real-world applications, including information coding and nonlinear signal detection. By combining relative entropy and mutual information with analysis of the correlated covariance, a wide range of uses is addressed, from basic application concerns to clinical diagnostics for detecting multi-disease effects.
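For readers who want to reproduce the basic quantities named in the abstract: the differential entropy of an n-variate Gaussian N(mu, Sigma) is h(X) = (1/2) ln((2*pi*e)^n |Sigma|), and the mutual information between jointly Gaussian sub-vectors follows from I(X;Y) = h(X) + h(Y) - h(X,Y), which for a bivariate Gaussian with correlation rho reduces to -(1/2) ln(1 - rho^2). The sketch below is illustrative only and not taken from the paper; the covariance values are assumptions.

```python
# Minimal sketch (not from the paper): closed-form entropy and mutual
# information for a multivariate Gaussian N(mu, Sigma), in nats.
import numpy as np

def gaussian_entropy(Sigma):
    # h(X) = 0.5 * (n * ln(2*pi*e) + ln det(Sigma))
    n = Sigma.shape[0]
    _, logdet = np.linalg.slogdet(Sigma)
    return 0.5 * (n * np.log(2 * np.pi * np.e) + logdet)

def gaussian_mutual_information(Sigma, idx_x, idx_y):
    # I(X;Y) = h(X) + h(Y) - h(X,Y) for jointly Gaussian sub-vectors
    Sxx = Sigma[np.ix_(idx_x, idx_x)]
    Syy = Sigma[np.ix_(idx_y, idx_y)]
    Sjoint = Sigma[np.ix_(idx_x + idx_y, idx_x + idx_y)]
    return gaussian_entropy(Sxx) + gaussian_entropy(Syy) - gaussian_entropy(Sjoint)

rho = 0.8  # illustrative correlation, an assumption for this sketch
Sigma = np.array([[1.0, rho],
                  [rho, 1.0]])

print(gaussian_entropy(Sigma))                       # joint entropy h(X, Y)
print(gaussian_mutual_information(Sigma, [0], [1]))  # equals -0.5 * ln(1 - rho**2)
```

With rho = 0.8 this gives I(X;Y) = -(1/2) ln(0.36), roughly 0.51 nats; as rho approaches 1 the constant-density ellipse collapses onto a line and the mutual information diverges, consistent with the elliptical interpretation the abstract refers to.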
Pages: 24