Transfer Entropy for Coupled Autoregressive Processes

Cited by: 19
Authors
Hahs, Daniel W. [1]
Pethel, Shawn D. [2]
Affiliations
[1] Torch Technol Inc, Huntsville, AL 35802 USA
[2] USA, Huntsville, AL 35898 USA
Keywords
transfer entropy; autoregressive process; Gaussian process; information transfer
DOI
10.3390/e15030767
Chinese Library Classification
O4 [Physics]
Discipline code
0702
Abstract
A method is shown for computing transfer entropy over multiple time lags for coupled autoregressive processes, using formulas for the differential entropy of multivariate Gaussian processes. Two examples are provided: (1) a first-order filtered noise process whose state is measured with additive noise, and (2) two first-order coupled processes, each driven by white process noise. For the first example, we found that increasing the first-order AR coefficient while holding fixed the correlation coefficient between the filtered and measured processes increases the transfer entropy, because the entropy of the measured process itself increases. For the second example, the minimum correlation coefficient occurs when the process-noise variances match, and matching these variances minimizes the total information flow, expressed as the sum of the transfer entropies in both directions. When the variances differ, the transfer entropy is larger in the direction away from the process with the larger process noise. With the process-noise variances fixed, the transfer entropies in both directions increase with the coupling strength. Finally, we note that the method can be employed generally to compute other information-theoretic quantities as well.
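The abstract's Gaussian setting makes the computation concrete: for jointly Gaussian variables, transfer entropy reduces to a log-ratio of conditional variances, each obtainable from covariances (equivalently, from least-squares residuals). The sketch below simulates two symmetrically coupled first-order AR processes and estimates the single-lag transfer entropy in both directions; the coefficients a, c and noise scales sx, sy are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Coupled first-order AR processes (illustrative parameters):
#   x[t+1] = a*x[t] + c*y[t] + wx[t]
#   y[t+1] = a*y[t] + c*x[t] + wy[t]
a, c = 0.5, 0.3
sx, sy = 1.0, 2.0            # y has the larger process-noise variance
n = 200_000
x = np.zeros(n)
y = np.zeros(n)
wx = rng.normal(0.0, sx, n)
wy = rng.normal(0.0, sy, n)
for t in range(n - 1):
    x[t + 1] = a * x[t] + c * y[t] + wx[t]
    y[t + 1] = a * y[t] + c * x[t] + wy[t]

def cond_var(target, conds):
    """Residual variance of a linear fit: Var(target | conds) for zero-mean Gaussians."""
    A = np.column_stack(conds)
    beta, *_ = np.linalg.lstsq(A, target, rcond=None)
    return np.var(target - A @ beta)

def transfer_entropy(src, dst):
    """Single-lag TE(src -> dst) in nats, via the Gaussian conditional-entropy formula."""
    d1, d0, s0 = dst[1:], dst[:-1], src[:-1]
    return 0.5 * np.log(cond_var(d1, [d0]) / cond_var(d1, [d0, s0]))

te_yx = transfer_entropy(y, x)   # information flow y -> x
te_xy = transfer_entropy(x, y)   # information flow x -> y
```

Consistent with the abstract, the estimated flow is larger in the direction away from the noisier process (here te_yx > te_xy, since sy > sx); the paper's method replaces these sample estimates with exact differential-entropy formulas and handles multiple lags.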
Pages: 767-788 (22 pages)