Variational Bayesian Inference in High-Dimensional Linear Mixed Models

Citations: 5
Authors
Yi, Jieyi [1 ]
Tang, Niansheng [1 ]
Affiliations
[1] Yunnan Univ, Yunnan Key Lab Stat Modeling & Data Anal, Kunming 650091, Yunnan, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Bayesian lasso; evidence lower bound; high-dimensional linear mixed model; spike and slab priors; variational Bayesian inference; structural equation models; variable selection; normalizing constants; Gibbs; likelihood
DOI
10.3390/math10030463
Chinese Library Classification (CLC)
O1 [Mathematics]
Subject Classification Code
0701; 070101
Abstract
In high-dimensional regression models, the Bayesian lasso with Gaussian spike-and-slab priors is widely adopted to select variables and estimate unknown parameters, but a standard Gibbs sampler then involves large matrix computations. To alleviate this issue, the Skinny Gibbs sampler is employed to draw the observations required for Bayesian variable selection; even so, when the sample size is much smaller than the number of variables, the computation remains time-consuming. As an alternative to the Skinny Gibbs sampler, we develop a variational Bayesian approach that simultaneously selects variables and estimates parameters in high-dimensional linear mixed models under Gaussian spike-and-slab priors on the population-specific fixed-effects regression coefficients, which are reformulated as a mixture of a normal distribution and an exponential distribution. A coordinate ascent algorithm, which can be implemented efficiently, is proposed to optimize the evidence lower bound. The Bayes factor, computed with the path sampling technique, is presented to compare two competing models within the variational Bayesian framework. Simulation studies are conducted to assess the performance of the proposed variational Bayesian method, and an empirical example is analyzed with the proposed methodology.
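As a rough illustration of the coordinate ascent idea mentioned in the abstract, the Python sketch below runs coordinate ascent variational inference for a plain linear regression model with a point-mass spike and a Gaussian slab on each coefficient. This is a minimal sketch under simplifying assumptions, not the authors' algorithm: the random effects of the linear mixed model, the normal-exponential reformulation of the prior, and the path-sampling Bayes factor are all omitted, and the function name cavi_spike_slab and the hyperparameters sigma2, tau2, and pi0 are hypothetical and treated as known.

```python
# Minimal CAVI sketch for spike-and-slab variable selection in y = X b + e.
# Simplified illustration only: no random effects, known noise variance sigma2,
# known slab variance tau2, known prior inclusion probability pi0.
import numpy as np

def cavi_spike_slab(X, y, sigma2=1.0, tau2=1.0, pi0=0.1, n_iter=100, tol=1e-6):
    p = X.shape[1]
    xtx = np.sum(X ** 2, axis=0)             # x_j' x_j for each column j
    alpha = np.full(p, pi0)                  # q(gamma_j = 1): inclusion probability
    mu = np.zeros(p)                         # E[b_j | gamma_j = 1] under q
    s2 = sigma2 / (xtx + sigma2 / tau2)      # Var[b_j | gamma_j = 1] under q
    Xb = X @ (alpha * mu)                    # current fitted values E_q[X b]
    for _ in range(n_iter):
        alpha_old = alpha.copy()
        for j in range(p):
            old_contrib = X[:, j] * (alpha[j] * mu[j])
            r_j = y - Xb + old_contrib       # residual with coordinate j removed
            mu[j] = s2[j] / sigma2 * (X[:, j] @ r_j)
            # variational log odds of including variable j
            logit = (np.log(pi0 / (1.0 - pi0))
                     + 0.5 * np.log(s2[j] / tau2)
                     + mu[j] ** 2 / (2.0 * s2[j]))
            alpha[j] = 1.0 / (1.0 + np.exp(-np.clip(logit, -30.0, 30.0)))
            Xb = Xb - old_contrib + X[:, j] * (alpha[j] * mu[j])
        if np.max(np.abs(alpha - alpha_old)) < tol:
            break
    return alpha, mu, s2
```

In this simplified setting, alpha[j] approximates the posterior inclusion probability of variable j, so coefficients with alpha[j] above 0.5 would be flagged as selected; convergence is monitored here through the change in alpha, whereas the paper optimizes and tracks the evidence lower bound itself.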
Pages: 19