Bayesian inference: an approach to statistical inference

Cited: 1
Author
Fraser, D. A. S. [1 ]
Affiliation
[1] Univ Toronto, Dept Stat, Toronto, ON M5S 3G3, Canada
Source
WILEY INTERDISCIPLINARY REVIEWS-COMPUTATIONAL STATISTICS | 2010 / Vol. 2 / No. 4
Keywords
confidence; conjugate prior; default prior; invariant prior; likelihood;
DOI
10.1002/wics.102
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Subject Classification Codes
020208 ; 070103 ; 0714 ;
Abstract
The original Bayes argument used an analogy involving an invariant prior and a statistical model, and held that the resulting combination of prior with likelihood provides a probability description of an unknown parameter value in an application. In particular contexts with invariance, this combination can now be called a confidence distribution, and it is subject to some restrictions when used to construct confidence intervals and regions. The procedure of combining a prior with the likelihood has since been widely generalized, with invariance relaxed to less restrictive criteria such as non-informative, reference, and other default priors. Other generalizations allow the prior to represent various forms of background information that is available or is elicited from those familiar with the statistical context; such priors can reasonably be called subjective priors. Still further generalizations address an anomaly in which marginalization with a vector parameter gives results that contradict the term probability; these are the Dawid, Stone, and Zidek marginalization paradoxes, and priors designed to avoid them are called targeted priors. A special case in which the prior describes a genuine random source for the parameter value is simply probability analysis, yet it is frequently treated as a Bayes procedure. We survey the argument in support of probability characteristics and outline various generalizations of the original Bayes proposal. (C) 2010 John Wiley & Sons, Inc.
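As a minimal illustration of the prior-with-likelihood combination the abstract describes (not part of the article itself), the sketch below uses the conjugate-prior case named in the keywords: a Beta prior updated by binomial data, where the posterior has a closed form. The data values, flat prior, and SciPy usage are assumptions made purely for illustration.

    # Illustrative sketch, not from the article: conjugate Beta-Binomial update,
    # i.e. prior combined with likelihood, renormalized to a posterior.
    from scipy import stats

    n, y = 20, 7          # hypothetical data: y successes in n Bernoulli trials
    a0, b0 = 1.0, 1.0     # Beta(a0, b0) prior for the success probability theta

    # Conjugacy: Beta prior x binomial likelihood gives a Beta(a0 + y, b0 + n - y) posterior.
    posterior = stats.beta(a0 + y, b0 + n - y)

    # A central 95% posterior interval; with a default (flat) prior, intervals of this
    # kind are what the article relates to confidence intervals and regions.
    lower, upper = posterior.ppf(0.025), posterior.ppf(0.975)
    print(f"posterior mean {posterior.mean():.3f}, 95% interval ({lower:.3f}, {upper:.3f})")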
Pages: 487-496
Page count: 10