Bayesian sparse graphical models and their mixtures

Cited by: 9
Authors
Talluri, Rajesh [1 ]
Baladandayuthapani, Veerabhadran [1 ]
Mallick, Bani K. [2 ]
Institutions
[1] Univ Texas MD Anderson Canc Ctr, Dept Biostat, Houston, TX 77030 USA
[2] Texas A&M Univ, Dept Stat, College Stn, TX 77843 USA
Source
STAT | 2014 / Vol. 3 / Issue 1
Keywords
Bayesian; covariance selection; finite mixtures; Gaussian graphical models; infinite mixtures; sparse modelling;
DOI
10.1002/sta4.49
CLC classification
O21 [Probability theory and mathematical statistics]; C8 [Statistics];
Subject classification
020208; 070103; 0714;
Abstract
We propose Bayesian methods for Gaussian graphical models that lead to sparse and adaptively shrunk estimators of the precision (inverse covariance) matrix. Our methods are based on lasso-type regularization priors leading to parsimonious parameterization of the precision matrix, which is essential in several applications involving learning relationships among the variables. In this context, we introduce a novel type of selection prior that develops a sparse structure on the precision matrix by making most of the elements exactly zero, in addition to ensuring positive definiteness, thus conducting model selection and estimation simultaneously. More importantly, we extend these methods to analyse clustered data using finite mixtures of Gaussian graphical models and infinite mixtures of Gaussian graphical models. We discuss appropriate posterior simulation schemes to implement posterior inference in the proposed models, including the evaluation of normalizing constants that are functions of parameters of interest, which result from the restriction of positive definiteness on the correlation matrix. We evaluate the operating characteristics of our method via several simulations and demonstrate the application to real-data examples in genomics. Copyright (C) 2014 John Wiley & Sons, Ltd.
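The abstract's central object, a sparse positive-definite precision matrix whose exact zeros encode conditional independence in a Gaussian graphical model, can be illustrated with a minimal numpy sketch. This is not the authors' Bayesian selection-prior sampler; it is a hand-constructed example showing the zero-pattern/independence correspondence and the positive-definiteness check the paper's priors must respect.

```python
import numpy as np

# Illustrative sketch (not the paper's Bayesian method): in a Gaussian
# graphical model, Omega[i, j] == 0 in the precision matrix means variables
# i and j are conditionally independent given all the others.

# A sparse precision matrix for 4 variables forming a chain 0-1-2,
# with variable 3 isolated.
Omega = np.array([
    [2.0, 0.6, 0.0, 0.0],
    [0.6, 2.0, 0.6, 0.0],
    [0.0, 0.6, 2.0, 0.0],
    [0.0, 0.0, 0.0, 2.0],
])

# Positive definiteness: all eigenvalues must be strictly positive.
assert np.all(np.linalg.eigvalsh(Omega) > 0)

# Partial correlation between i and j given the rest:
#   rho_ij = -Omega_ij / sqrt(Omega_ii * Omega_jj)
d = np.sqrt(np.diag(Omega))
partial_corr = -Omega / np.outer(d, d)
np.fill_diagonal(partial_corr, 1.0)

print(np.round(partial_corr, 3))
```

Off-chain entries of `partial_corr` (e.g. between variables 0 and 2, or anything involving 3) are exactly zero, which is the sparsity the selection prior in the paper targets while simultaneously enforcing positive definiteness.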
Pages: 109-125
Page count: 17