A measure for intrinsic information

Cited: 0
Authors
Leonardo S. Barbosa
William Marshall
Sabrina Streipert
Larissa Albantakis
Giulio Tononi
Affiliations
[1] University of Wisconsin-Madison,Department of Psychiatry
[2] Brock University,Department of Mathematics and Statistics
[3] McMaster University,Department of Mathematics and Statistics
Source
Scientific Reports, Volume 10
Abstract
We introduce an information measure that reflects the intrinsic perspective of a receiver or sender of a single symbol, who has no access to the communication channel and its source or target. The measure satisfies three desired properties—causality, specificity, intrinsicality—and is shown to be unique. Causality means that symbols must be transmitted with probability greater than chance. Specificity means that information must be transmitted by an individual symbol. Intrinsicality means that a symbol must be taken as such and cannot be decomposed into signal and noise. It follows that the intrinsic information carried by a specific symbol increases if the repertoire of symbols increases without noise (expansion) and decreases if it does so without signal (dilution). An optimal balance between expansion and dilution is relevant for systems whose elements must assess their inputs and outputs from the intrinsic perspective, such as neurons in a network.
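The abstract states the measure's three defining properties but not its closed form. As a hedged illustration only, the sketch below uses a candidate of the form max over outcomes of p·log2(p/q), where p is an outcome's probability given the symbol and q its chance probability; the function name and the toy channels are assumptions for this example, not the paper's definition. It shows the expansion/dilution behavior described above: a larger noiseless repertoire raises the value, while spreading the conditional probability over extra outcomes (noise) lowers it.

```python
import math

def intrinsic_information(p_cond, p_marg):
    """Hypothetical intrinsic information of a single symbol.

    p_cond: probability of each outcome given the transmitted symbol.
    p_marg: chance (marginal) probability of each outcome.
    Returns the maximum over outcomes of p * log2(p / q), an
    illustrative stand-in for the measure described in the abstract.
    """
    return max(p * math.log2(p / q)
               for p, q in zip(p_cond, p_marg) if p > 0)

# Expansion: a noiseless channel over a larger repertoire carries more.
four_noiseless  = intrinsic_information([1, 0, 0, 0], [0.25] * 4)      # 2.0 bits
eight_noiseless = intrinsic_information([1] + [0] * 7, [0.125] * 8)    # 3.0 bits

# Dilution: same 8-symbol repertoire, but the symbol's effect is
# spread 50/50 over two outcomes, so the value drops.
eight_diluted = intrinsic_information([0.5, 0.5] + [0] * 6, [0.125] * 8)  # 1.0 bit
```

Under this illustrative form, expanding a noiseless repertoire from 4 to 8 symbols raises the value from 2 to 3 bits, while diluting the signal halves it to 1 bit, matching the trade-off the abstract attributes to the measure.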