Some notes on mutual information between past and future

Cited by: 0
Authors
Li, LM [1]
Affiliation
[1] Univ So Calif, Los Angeles, CA 90089 USA
Keywords
mutual information; Toeplitz; entropy; reflection coefficient; long-memory
DOI
Not available
Chinese Library Classification
O1 [Mathematics]
Discipline codes
0701; 070101
Abstract
We present some new results on the mutual information between past and future for Gaussian stationary sequences, and we provide several formulae to calculate this quantity. As a by-product, we establish the so-called reflectrum identity, which links partial autocorrelation coefficients and cepstrum coefficients. To obtain these results, we give an account of several regularity conditions for Gaussian stationary processes in terms of properties of the associated Toeplitz and Hankel operators. We discuss conditions under which the mutual information is finite. These results lead to an interesting perspective on the definition of long-memory processes. Our result implies that zeros on the unit circle can cause the mutual information to be infinite; examples include fractional autoregressive integrated moving average (ARIMA) models. In addition, we consider a finite sample from a Gaussian stationary sequence. In the expansion of the determinant of its covariance matrix (a Toeplitz matrix), the first and second terms are the entropy and the mutual information, respectively. A form of approximation to the likelihood using entropy and mutual information is presented.
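The reflectrum identity mentioned in the abstract can be checked numerically in a simple case. The sketch below assumes the standard Gaussian formulas, with I(past; future) = -1/2 · Σ log(1 - π_k²) over partial autocorrelations π_k and, equivalently, 1/2 · Σ k·c_k² over cepstrum coefficients c_k; the AR(1) closed forms and all function names are illustrative assumptions, not taken verbatim from the paper.

```python
import numpy as np

# Assumed two sides of the reflectrum identity for a Gaussian stationary
# sequence (notation illustrative):
#   I(past; future) = -1/2 * sum_k log(1 - pi_k**2)   (reflection coefficients)
#                   =  1/2 * sum_k k * c_k**2         (cepstrum coefficients)
# For an AR(1) process x_t = phi * x_{t-1} + e_t, |phi| < 1:
#   pi_1 = phi, pi_k = 0 for k > 1;  c_k = phi**k / k.

def mi_from_reflection(phi, K=200):
    """Mutual information via partial autocorrelations (AR(1) case)."""
    pis = np.zeros(K)
    pis[0] = phi  # only the first partial autocorrelation is nonzero
    return -0.5 * np.sum(np.log1p(-pis**2))

def mi_from_cepstrum(phi, K=200):
    """Mutual information via cepstrum coefficients (AR(1) case, truncated)."""
    k = np.arange(1, K + 1)
    c = phi**k / k
    return 0.5 * np.sum(k * c**2)

phi = 0.5
print(mi_from_reflection(phi))  # both sides agree with
print(mi_from_cepstrum(phi))    # -1/2 * log(1 - phi**2)
```

For an AR(1) process both sides reduce to -1/2 · log(1 - φ²), which is finite; the abstract's point is that this sum can diverge, e.g. for fractional ARIMA models with zeros on the unit circle.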
Pages: 309-322
Page count: 14