JEFFREYS PRIOR IS ASYMPTOTICALLY LEAST FAVORABLE UNDER ENTROPY RISK

Cited by: 194
Authors
CLARKE, BS
BARRON, AR
Affiliations
[1] UNIV BRITISH COLUMBIA,DEPT STAT,VANCOUVER V6T 1Z2,BC,CANADA
[2] YALE UNIV,NEW HAVEN,CT 06520
Keywords
BAYES RISK; MINIMAX RISK; KULLBACK-LEIBLER INFORMATION; JEFFREYS PRIOR; FISHER INFORMATION; SHANNON'S MUTUAL INFORMATION; PARAMETRIC DENSITY ESTIMATION; DATA COMPRESSION; REFERENCE PRIORS; LEAST FAVORABLE PRIORS;
DOI
10.1016/0378-3758(94)90153-8
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Subject classification codes
020208 ; 070103 ; 0714 ;
Abstract
We provide a rigorous proof that Jeffreys' prior asymptotically maximizes Shannon's mutual information between a sample of size n and the parameter. This was conjectured by Bernardo (1979) and, despite the absence of a proof, forms the basis of the reference prior method in Bayesian statistical analysis. Our proof rests on an examination of large sample decision theoretic properties associated with the relative entropy or the Kullback-Leibler distance between probability density functions for independent and identically distributed random variables. For smooth finite-dimensional parametric families we derive an asymptotic expression for the minimax risk and for the related maximin risk. As a result, we show that, among continuous positive priors, Jeffreys' prior uniquely achieves the asymptotic maximin value. In the discrete parameter case we show that, asymptotically, the Bayes risk reduces to the entropy of the prior so that the reference prior is seen to be the maximum entropy prior. We identify the physical significance of the risks by giving two information-theoretic interpretations in terms of probabilistic coding.
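For context, a sketch of the asymptotic expansion behind this result, in notation assumed here rather than taken from the record (d = parameter dimension, w = continuous positive prior density, I(θ) = Fisher information matrix per observation, X^n = sample of size n), valid under smoothness and regularity conditions on the parametric family:

\[
I(\Theta; X^{n}) \;=\; \frac{d}{2}\,\log\frac{n}{2\pi e} \;+\; \int w(\theta)\,\log\frac{\sqrt{\det I(\theta)}}{w(\theta)}\,d\theta \;+\; o(1),
\]

so to first order the mutual information is maximized by Jeffreys' prior, \( w^{*}(\theta) \propto \sqrt{\det I(\theta)} \), and the asymptotic maximin value is

\[
\frac{d}{2}\,\log\frac{n}{2\pi e} \;+\; \log\!\int \sqrt{\det I(\theta)}\,d\theta \;+\; o(1).
\]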
Pages: 37-60
Page count: 24