Mutual information (MI) is widely used to analyze the neural code in a variety of stochastic neuronal sensory systems. Unfortunately, MI is analytically tractable only for simple coding problems. One way to address this difficulty is to relate MI to Fisher information, which is easier to compute and to interpret in terms of neurophysiological parameters. The relationship between the two measures is not always clear, however, and often depends on the probability density function that best describes the noise. Using Stam's inequality, we show here that deviations from Gaussianity in the neuronal response distribution can lead to a large overestimation of MI, even in the small-noise regime. This result is especially relevant when studying neural codes carried by Poissonian neurons.
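For readers unfamiliar with the inequality invoked above, the following sketch restates the one-dimensional form of Stam's inequality in standard information-theoretic notation (h for differential entropy, N for entropy power, J for Fisher information); this is a standard reference statement, not the paper's own derivation.

% Stam's inequality (one-dimensional form), stated in standard notation;
% a reference sketch, not taken verbatim from the paper.
% h(X): differential entropy, J(X): Fisher information, N(X): entropy power.
\[
  N(X)\, J(X) \;\ge\; 1,
  \qquad
  N(X) \;=\; \frac{1}{2\pi e}\, e^{2 h(X)},
\]
% Rearranged, this gives a lower bound on the differential entropy,
\[
  h(X) \;\ge\; \tfrac{1}{2}\,\log\!\frac{2\pi e}{J(X)},
\]
% with equality if and only if X is Gaussian.

When the response distribution is non-Gaussian (e.g. Poissonian), the bound is strict, so entropy surrogates built from the Gaussian equality case deviate from the true values; this is consistent with, though not a full reconstruction of, the abstract's claim that Fisher-information-based approximations can substantially overestimate MI.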
Authors: Gibilisco, Paolo; Imparato, Daniele; Isola, Tommaso

Affiliations:
Univ Grenoble Alpes, BP 166, F-38042 Grenoble, France
CNRS, LPMMC UMR 5493, BP 166, F-38042 Grenoble, France
Univ Roma Tor Vergata, Dipartimento SEFEMEQ, Fac Econ, I-00133 Rome, Italy
Politecn Torino, Dipartimento Matemat, I-10129 Turin, Italy
Univ Roma Tor Vergata, Dipartimento Matemat, I-00133 Rome, Italy