Mixture Models, Bayes Fisher Information, and Divergence Measures

Cited by: 29
Authors
Asadi, Majid [1 ,2 ]
Ebrahimi, Nader [3 ]
Kharazmi, Omid [1 ]
Soofi, Ehsan S. [4 ]
Affiliations
[1] Univ Isfahan, Dept Stat, Esfahan 81744, Iran
[2] Inst Res Fundamental Sci IPM, Sch Math, Tehran 193955746, Iran
[3] Northern Illinois Univ, Div Stat, DeKalb, IL 60115 USA
[4] Univ Wisconsin, Lubar Sch Business, Milwaukee, WI 53201 USA
Keywords
Fisher information; Jeffreys divergence; Jensen-Shannon divergence; Kullback-Leibler divergence; Rényi divergence; Tsallis divergence
DOI
10.1109/TIT.2018.2877608
Chinese Library Classification (CLC)
TP [Automation and Computer Technology]
Subject Classification Code
0812
Abstract
This paper presents Bayes Fisher information measures, defined as the expected Fisher information under a prior distribution for the parameter, for the arithmetic, geometric, and generalized mixtures of two probability density functions. The Fisher information of the arithmetic mixture about the mixing parameter is related to the chi-square divergence, Shannon entropy, and the Jensen-Shannon divergence. The Bayes Fisher measures of the three mixture models are related to the Kullback-Leibler, Jeffreys, Jensen-Shannon, Rényi, and Tsallis divergences. These measures indicate that the farther apart the two components are, the more informative the data are about the mixing parameter. We also unify three different relative-entropy derivations of the geometric mixture that are scattered across the statistics and physics literatures. Extending two of these formulations to the minimization of Tsallis divergence yields the generalized mixture as the solution.
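The abstract's central objects can be made concrete with a short sketch. The LaTeX display below records the standard two-component setup, the chi-square identity for the Fisher information about the mixing weight, and the classical variational characterization of the geometric mixture. The notation (f_1, f_2, alpha, pi) is ours and need not match the paper's, and the identities shown are the textbook facts the abstract alludes to, not a reproduction of the paper's derivations.

% Arithmetic mixture f_alpha and (normalized) geometric mixture g_alpha
% of two densities f_1 and f_2, with mixing weight alpha in [0,1]:
\[
  f_\alpha(x) = \alpha f_1(x) + (1-\alpha) f_2(x), \qquad
  g_\alpha(x) = \frac{f_1^{\alpha}(x)\, f_2^{1-\alpha}(x)}
                     {\int f_1^{\alpha}(t)\, f_2^{1-\alpha}(t)\, dt}.
\]
% Fisher information of f_alpha about alpha; because
% f_1 - f_alpha = (1-alpha)(f_1 - f_2) and f_2 - f_alpha = alpha(f_2 - f_1),
% it is a rescaled chi-square divergence between a component and the mixture:
\[
  \mathcal{F}(\alpha)
    = \int \frac{\bigl(f_1(x) - f_2(x)\bigr)^{2}}{f_\alpha(x)}\, dx
    = \frac{\chi^{2}(f_1 \,\|\, f_\alpha)}{(1-\alpha)^{2}}
    = \frac{\chi^{2}(f_2 \,\|\, f_\alpha)}{\alpha^{2}}.
\]
% The Bayes Fisher information averages F(alpha) over a prior pi on [0,1]:
\[
  \mathcal{F}_{\pi} = \int_{0}^{1} \mathcal{F}(\alpha)\, \pi(\alpha)\, d\alpha.
\]
% Classical variational fact behind the geometric mixture: g_alpha minimizes
%   alpha D(g || f_1) + (1-alpha) D(g || f_2)
% over all densities g, since this objective equals D(g || g_alpha) plus a
% term constant in g; the paper's Tsallis-divergence analogue of this
% minimization is what produces the generalized mixture.

The chi-square identity also makes the abstract's separation claim tangible: as f_1 and f_2 move apart, chi^2(f_1 || f_alpha) grows, so each observation carries more Fisher information about alpha.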
Pages: 2316-2321 (6 pages)