Bayesian Priors from Loss Matching

Cited by: 4
Authors
Brown, Philip J. [1]
Walker, Stephen G. [1]
Affiliation
[1] Univ Kent, Sch Math Stat & Actuarial Sci, Canterbury, Kent, England
Keywords
Conjugate prior; Dirichlet process; Kullback-Leibler divergence; loss function; model choice; M-open; prior distribution; self-information loss; POSTERIOR DISTRIBUTIONS; INFORMATION; PROBABILITY; INFERENCE; BEHAVIOR
DOI
10.1111/j.1751-5823.2011.00176.x
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline codes
020208; 070103; 0714
Abstract
This paper is concerned with the construction of prior probability measures for parametric families of densities in a framework where only beliefs or knowledge about a single observable data point are required. We pay particular attention to the parameter that minimizes a measure of divergence to the distribution providing the data. The prior distribution reflects this attention, and we discuss the application of Bayes' rule from this perspective. Our framework is fundamentally non-parametric, and we are able to interpret prior distributions on the parameter space using ideas of matching loss functions, one coming from the data model and the other from the prior.
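The abstract's central object is the parameter value that minimizes a divergence from the data-generating distribution to the model family. As a hedged illustration (not the paper's own code), the sketch below approximates the Kullback-Leibler projection numerically: minimizing KL(F || f_theta) over theta is equivalent to minimizing the expected self-information loss E_F[-log f_theta(X)], which can be estimated by Monte Carlo. The choice of a Gamma "true" distribution F and a Normal model family is an illustrative assumption.

```python
# Sketch: approximate the KL-minimizing parameter of a Normal model
# when the data actually come from a Gamma distribution F.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(0)
# Draws from the data-generating distribution F = Gamma(shape=3, scale=2),
# so E[X] = 6 and Var[X] = 12.
x = stats.gamma(a=3.0, scale=2.0).rvs(size=200_000, random_state=rng)

def expected_self_info_loss(theta):
    """Monte Carlo estimate of E_F[-log f_theta(X)] for a Normal model.

    Parameterize sigma on the log scale so the optimizer is unconstrained.
    """
    mu, log_sigma = theta
    return -stats.norm(mu, np.exp(log_sigma)).logpdf(x).mean()

res = optimize.minimize(expected_self_info_loss, x0=[0.0, 0.0])
mu_star, sigma_star = res.x[0], np.exp(res.x[1])
# For the Normal family, the KL projection matches the first two moments
# of F, so mu_star should be near 6 and sigma_star near sqrt(12).
```

This is the same quantity that, in the M-open view the keywords allude to, plays the role of the "best" parameter even though no member of the model family is the true distribution.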
Pages: 60-82 (23 pages)