Bayesian predictive inference under informative sampling and transformation

Cited by: 4
Authors
Nandram, Balgobin
Choi, Jai Won
Shen, Gang
Burgos, Corinne
Affiliations
[1] Worcester Polytech Inst, Dept Math Sci, Worcester, MA 01609 USA
[2] Natl Ctr Hlth Stat, Off Res & Methodol, Hyattsville, MD 20782 USA
[3] Purdue Univ, Dept Stat, W Lafayette, IN 47907 USA
Keywords
establishment survey; Gibbs sampler; Horvitz-Thompson estimator; non-ignorable model; selection probabilities; transformation to normality;
DOI
10.1002/asmb.650
CLC classification
C93 [Management]; O22 [Operations Research];
Discipline codes
070105; 12; 1201; 1202; 120202;
Abstract
We consider the problem in which a biased sample is selected from a finite population (a random sample from a super-population), and inference is required for the finite population mean and the super-population mean. The selection probabilities are linearly related to the measurements, yielding a non-ignorable selection model. When all the selection probabilities are known, as in our problem, inference about the finite population mean and the super-population mean can be made. As a practical matter, our method requires normality, but the measurements are not necessarily normally distributed. This creates a dilemma: a transformation to normality is needed, but the transformation destroys the linearity between the selection probabilities and the measurements. Resolving this dilemma is the focus of this work. We use the Gibbs sampler and the sampling importance resampling (SIR) algorithm to fit the non-ignorable selection model to a simple example on natural gas production. Our non-ignorable selection model estimates the finite population mean production much closer to the true finite population mean than a model that ignores the selection probabilities, and it is also more precise than the latter model. A naive 95% credible interval based on the Horvitz-Thompson estimator is too wide. Copyright (c) 2006 John Wiley & Sons, Ltd.
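The Horvitz-Thompson estimator mentioned in the abstract weights each sampled measurement by the inverse of its known selection probability, which corrects the upward bias an unweighted mean suffers when selection probabilities are linearly related to the measurements. A minimal sketch of this idea, with simulated data and variable names that are illustrative assumptions, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated finite population of N positive measurements (e.g., production).
N = 1000
y = rng.lognormal(mean=2.0, sigma=0.5, size=N)

# Informative sampling: selection probabilities proportional to y,
# scaled so they sum to the expected sample size n.
n = 100
pi = n * y / y.sum()

# Poisson sampling: unit i is selected independently with probability pi[i].
selected = rng.random(N) < pi
ys, pis = y[selected], pi[selected]

# Horvitz-Thompson estimator of the finite population mean:
# weight each sampled unit by 1/pi_i, then divide by N.
ht_mean = np.sum(ys / pis) / N

# The naive unweighted sample mean ignores the informative selection
# and over-represents large measurements.
naive_mean = ys.mean()

print(f"true mean  = {y.mean():.3f}")
print(f"HT mean    = {ht_mean:.3f}")
print(f"naive mean = {naive_mean:.3f}")
```

Because large units are over-sampled here, the naive mean overshoots the true finite population mean, while the inverse-probability weighting pulls the estimate back toward it.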
Pages: 559-572
Page count: 14
Related papers (50 records)
  • [1] Bayesian Predictive Inference For Finite Population Quantities Under Informative Sampling
    Ma, Junheng
    Sedransk, Joe
    Nandram, Balgobin
    Chen, Lu
    STATISTICS AND APPLICATIONS, 2018, 16 (01): 207-226
  • [2] Approximate Bayesian inference under informative sampling
    Wang, Z.
    Kim, J. K.
    Yang, S.
    BIOMETRIKA, 2018, 105 (01): 91-102
  • [3] Bayesian Inference for Repeated Measures Under Informative Sampling
    Savitsky, Terrance D.
    Leon-Novelo, Luis G.
    Engle, Helen
    JOURNAL OF OFFICIAL STATISTICS, 2024, 40 (01): 161-189
  • [4] Scalable Approximate Bayesian Inference for Outlier Detection under Informative Sampling
    Savitsky, Terrance D.
    JOURNAL OF MACHINE LEARNING RESEARCH, 2016, 17
  • [5] BAYESIAN PREDICTIVE INFERENCE UNDER SEQUENTIAL SAMPLING WITH SELECTION BIAS
    Nandram, B.
    Kim, D.
    SOME RECENT ADVANCES IN MATHEMATICS & STATISTICS, 2013: 169-186
  • [6] Bayesian estimation under informative sampling
    Savitsky, Terrance D.
    Toth, Daniell
    ELECTRONIC JOURNAL OF STATISTICS, 2016, 10 (01): 1677-1708
  • [7] Fully Bayesian estimation under informative sampling
    Leon-Novelo, Luis G.
    Savitsky, Terrance D.
    ELECTRONIC JOURNAL OF STATISTICS, 2019, 13 (01): 1608-1645
  • [8] Pseudo Bayesian Mixed Models under Informative Sampling
    Savitsky, Terrance D.
    Williams, Matthew R.
    JOURNAL OF OFFICIAL STATISTICS, 2022, 38 (03): 901-928
  • [9] Bayesian Estimation Under Informative Sampling with Unattenuated Dependence
    Williams, Matthew R.
    Savitsky, Terrance D.
    BAYESIAN ANALYSIS, 2020, 15 (01): 57-77
  • [10] Bayesian pairwise estimation under dependent informative sampling
    Williams, Matthew R.
    Savitsky, Terrance D.
    ELECTRONIC JOURNAL OF STATISTICS, 2018, 12 (01): 1631-1661