Information Measures in Perspective

Cited by: 69
Authors
Ebrahimi, Nader [1 ]
Soofi, Ehsan S. [2 ]
Soyer, Refik [3 ]
Affiliations
[1] Northern Illinois University, Division of Statistics, DeKalb, IL 60115, USA
[2] University of Wisconsin-Milwaukee, Sheldon B. Lubar School of Business, Milwaukee, WI 53201, USA
[3] George Washington University, Department of Decision Sciences, Washington, DC 20052, USA
Keywords
Bayesian information; dynamic information; entropy; Kullback-Leibler information; mutual information; maximum entropy; axiomatic approach; Shannon entropies; inference; Rényi; discrimination; expressions; univariate; framework; goodness
DOI
10.1111/j.1751-5823.2010.00105.x
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Subject Classification Codes
020208; 070103; 0714
Abstract
Information-theoretic methodologies are increasingly being used in various disciplines. Frequently an information measure is adapted for a problem, yet the perspective of information as the unifying notion is overlooked. We set forth this perspective through presenting information-theoretic methodologies for a set of problems in probability and statistics. Our focal measures are Shannon entropy and Kullback-Leibler information. The background topics for these measures include notions of uncertainty and information, their axiomatic foundation, interpretations, properties, and generalizations. Topics with broad methodological applications include discrepancy between distributions, derivation of probability models, dependence between variables, and Bayesian analysis. More specific methodological topics include model selection, limiting distributions, optimal prior distribution and design of experiment, modeling duration variables, order statistics, data disclosure, and relative importance of predictors. Illustrations range from very basic to highly technical ones that draw attention to subtle points.
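For reference, the two focal measures named in the abstract, Shannon entropy and Kullback-Leibler information, have the following standard forms, written here in one common notation for a density f and a reference density g (continuous case; sums replace integrals for discrete distributions):

\[
H(f) = -\int f(x)\,\log f(x)\,dx,
\qquad
K(f:g) = \int f(x)\,\log\frac{f(x)}{g(x)}\,dx .
\]

Mutual information, also listed among the keywords, is the Kullback-Leibler information between a joint density and the product of its marginals, and thus quantifies dependence between variables.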
Pages: 383-412
Number of pages: 30