Geometric Insights into the Multivariate Gaussian Distribution and Its Entropy and Mutual Information

Cited by: 6
Authors
Jwo, Dah-Jing [1 ]
Cho, Ta-Shun [2 ]
Biswal, Amita [1 ]
Affiliations
[1] Natl Taiwan Ocean Univ, Dept Commun Nav & Control Engn, 2 Peining Rd, Keelung, Taiwan
[2] Asia Univ, Dept Business Adm, 500 Liufeng Rd, Taichung 41354, Taiwan
Keywords
multivariate Gaussians; correlated random variables; visualization; entropy; relative entropy; mutual information; RANDOM SIGNALS; CHANNEL; FORMULA;
DOI
10.3390/e25081177
Chinese Library Classification (CLC)
O4 [Physics]
Discipline Classification Code
0702
Abstract
In this paper, we provide geometric insights, supported by visualization, into the multivariate Gaussian distribution and its entropy and mutual information. Several key methodologies for developing the multivariate Gaussian distribution together with its entropy and mutual information are presented and discussed, with illustrations from both technical and statistical perspectives. The paper examines broad structural measures for Gaussian distributions, showing that they can be characterized information-theoretically through the relationship between a given covariance matrix and its correlated random variables (in terms of relative entropy). The material enables readers to better grasp the concepts, understand the techniques, and correctly implement software programs for further study of the topic's theory and implementations, and it helps readers master the fundamental ideas needed to study multivariate data sets under the Gaussian distribution. The simulation results also convey the behavior of different elliptical interpretations of the multivariate Gaussian distribution and its entropy for real-world applications, including information coding and nonlinear signal detection. By involving relative entropy and mutual information, together with analysis of correlated covariance structures, a wide range of topics is addressed, from basic application concerns to clinical diagnostics for detecting multi-disease effects.
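The quantities the abstract discusses have closed forms for Gaussians: the differential entropy of a k-variate Gaussian is h = (1/2) ln((2*pi*e)^k * det(Sigma)), and for a bivariate Gaussian with correlation rho the mutual information is I(X;Y) = -(1/2) ln(1 - rho^2). A minimal sketch (not code from the paper; function names are illustrative, NumPy assumed) showing that the covariance-based entropy identity I(X;Y) = h(X) + h(Y) - h(X,Y) reproduces the closed form:

```python
import numpy as np

def gaussian_entropy(cov):
    """Differential entropy (in nats) of N(mu, cov):
    h = 0.5 * ln((2*pi*e)^k * det(cov))."""
    cov = np.atleast_2d(cov)
    k = cov.shape[0]
    sign, logdet = np.linalg.slogdet(cov)  # stable log-determinant
    if sign <= 0:
        raise ValueError("covariance must be positive definite")
    return 0.5 * (k * np.log(2 * np.pi * np.e) + logdet)

def bivariate_mutual_information(rho):
    """Closed form I(X;Y) = -0.5 * ln(1 - rho^2) for a bivariate Gaussian."""
    return -0.5 * np.log(1.0 - rho ** 2)

# Example: unit-variance pair with correlation 0.8
rho = 0.8
cov = np.array([[1.0, rho], [rho, 1.0]])
h_joint = gaussian_entropy(cov)                        # h(X, Y)
h_marginals = 2 * gaussian_entropy(np.array([[1.0]]))  # h(X) + h(Y)
mi = h_marginals - h_joint                             # I(X;Y)
print(mi, bivariate_mutual_information(rho))           # the two agree
```

As rho approaches ±1 the joint covariance becomes singular, the joint entropy diverges to -infinity, and the mutual information diverges to +infinity, which is the information-theoretic counterpart of the confidence ellipse collapsing onto a line.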
Pages: 24