Comparing methods for statistical inference with model uncertainty

Cited by: 20
Authors
Porwal, Anupreet [1 ]
Raftery, Adrian E. [1 ,2 ]
Affiliations
[1] Univ Washington, Dept Stat, Seattle, WA 98195 USA
[2] Univ Washington, Dept Sociol, Seattle, WA 98195 USA
Keywords
Bayesian model averaging; interval estimation; LASSO; model selection; parameter estimation; variable selection; regression shrinkage; priors; regularization; performance; horseshoe
DOI
10.1073/pnas.2120737119
Chinese Library Classification
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences]
Discipline Codes
07; 0710; 09
Abstract
Probability models are used for many statistical tasks, notably parameter estimation, interval estimation, inference about model parameters, point prediction, and interval prediction. Thus, choosing a statistical model and accounting for uncertainty about this choice are important parts of the scientific process. Here we focus on one such choice, that of variables to include in a linear regression model. Many methods have been proposed, including Bayesian and penalized likelihood methods, and it is unclear which one to use. We compared 21 of the most popular methods by carrying out an extensive set of simulation studies based closely on real datasets that span a range of situations encountered in practical data analysis. Three adaptive Bayesian model averaging (BMA) methods performed best across all statistical tasks. These used adaptive versions of Zellner's g-prior for the parameters, where the prior variance parameter g is a function of sample size or is estimated from the data. We found that for BMA methods implemented with Markov chain Monte Carlo, 10,000 iterations were enough. Computationally, we found two of the three best methods (BMA with g = √n, and empirical Bayes-local) to be competitive with the least absolute shrinkage and selection operator (LASSO), which is often preferred as a variable selection technique because of its computational efficiency. BMA performed better than Bayesian model selection (in which just one model is selected).
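For context, the g-prior referred to above is Zellner's conjugate prior on the coefficients of a candidate model with design matrix X_γ; in standard notation,

\[
\beta_\gamma \mid g, \sigma^2 \;\sim\; \mathcal{N}\!\left(\mathbf{0},\; g\,\sigma^2 \left(X_\gamma^\top X_\gamma\right)^{-1}\right),
\]

where the best-performing adaptive variants set g = √n or estimate g from the data by empirical Bayes.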
Pages: 8
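To make the g = √n recipe concrete, below is a minimal self-contained Python sketch (an illustration, not the code used in the paper): it enumerates all predictor subsets, weights each subset by its closed-form g-prior Bayes factor against the null model, and averages the shrunken least-squares coefficients across models.

# A minimal sketch (not the paper's implementation) of Bayesian model
# averaging under Zellner's g-prior with the adaptive choice g = sqrt(n).
# It enumerates all 2^p predictor subsets, so it is only feasible for
# small p; larger model spaces require MCMC over models instead.
from itertools import combinations

import numpy as np


def bma_g_prior(X, y, g=None):
    """Model-averaged posterior-mean coefficients under the g-prior."""
    n, p = X.shape
    if g is None:
        g = np.sqrt(n)  # adaptive g highlighted in the abstract

    # Center the data so the intercept can carry a flat prior.
    yc = y - y.mean()
    Xc = X - X.mean(axis=0)
    tss = yc @ yc  # total sum of squares of the centered response

    log_bfs, betas = [], []
    for k in range(p + 1):
        for s in combinations(range(p), k):
            beta = np.zeros(p)
            if k == 0:
                log_bf = 0.0  # the null model is the reference
            else:
                Xs = Xc[:, list(s)]
                bhat, *_ = np.linalg.lstsq(Xs, yc, rcond=None)
                resid = yc - Xs @ bhat
                r2 = 1.0 - (resid @ resid) / tss
                # Closed-form Bayes factor vs. the null model under the
                # g-prior (Liang et al. 2008).
                log_bf = (0.5 * (n - 1 - k) * np.log1p(g)
                          - 0.5 * (n - 1) * np.log1p(g * (1.0 - r2)))
                beta[list(s)] = (g / (1.0 + g)) * bhat  # g-prior shrinkage
            log_bfs.append(log_bf)
            betas.append(beta)

    log_bfs = np.asarray(log_bfs)
    w = np.exp(log_bfs - log_bfs.max())  # stabilize before normalizing
    w /= w.sum()
    return np.asarray(betas).T @ w  # BMA posterior-mean coefficients


# Example: recover a sparse signal from simulated data.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 6))
y = 2.0 * X[:, 0] - X[:, 2] + rng.normal(size=100)
print(np.round(bma_g_prior(X, y), 2))

Enumeration is exponential in the number of predictors, which is why practical BMA implementations sample the model space with MCMC; per the abstract, 10,000 iterations were enough in the paper's experiments.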