The Hidden Life of Latent Variables: Bayesian Learning with Mixed Graph Models

Cited by: 0
Authors
Silva, Ricardo [1]
Ghahramani, Zoubin [2,3]
Affiliations
[1] UCL, Dept Stat Sci, London WC1E 6BT, England
[2] Univ Cambridge, Dept Engn, Cambridge CB2 1PZ, England
[3] Carnegie Mellon Univ, Machine Learning Dept, Pittsburgh, PA 15213 USA
Keywords
graphical models; structural equation models; Bayesian inference; Markov chain Monte Carlo; latent variable models
DOI
Not available
Chinese Library Classification
TP [automation technology; computer technology]
Discipline code
0812
Abstract
Directed acyclic graphs (DAGs) have been widely used as a representation of conditional independence in machine learning and statistics. Moreover, hidden or latent variables are often an important component of graphical models. However, DAG models suffer from an important limitation: the family of DAGs is not closed under marginalization of hidden variables. This means that in general we cannot use a DAG to represent the independencies over a subset of variables in a larger DAG. Directed mixed graphs (DMGs) are a representation that includes DAGs as a special case, and overcomes this limitation. This paper introduces algorithms for performing Bayesian inference in Gaussian and probit DMG models. An important requirement for inference is the specification of the distribution over parameters of the models. We introduce a new distribution for covariance matrices of Gaussian DMGs. We discuss and illustrate how several Bayesian machine learning tasks can benefit from the principle presented here: the power to model dependencies that are generated from hidden variables, but without necessarily modeling such variables explicitly.
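The closure property the abstract describes can be checked numerically. The sketch below (illustrative only, not taken from the paper) uses an assumed toy linear Gaussian model with hidden H and observed X, Y, and edges H→X, H→Y, X→Y; the coefficients a, b, c are arbitrary choices. Marginalizing H out of the full DAG yields exactly the covariance that a DMG over X, Y with a directed edge X→Y and a bidirected edge X↔Y (correlated errors) parameterizes directly, without modeling H.

```python
# Sketch: marginalizing a hidden variable out of a linear Gaussian DAG
# gives the covariance that a directed mixed graph (DMG) with a
# bidirected edge parameterizes directly. Toy model (assumed, not from
# the paper): H -> X, H -> Y, X -> Y, with H hidden, unit error variances.
import numpy as np

a, b, c = 0.8, 0.5, 1.2  # path coefficients for H->X, H->Y, X->Y

# Full DAG over (H, X, Y): x = B x + e, e ~ N(0, I), so
# Sigma = (I - B)^{-1} (I - B)^{-T}.
B_full = np.array([[0.0, 0.0, 0.0],
                   [a,   0.0, 0.0],
                   [b,   c,   0.0]])
M = np.linalg.inv(np.eye(3) - B_full)
sigma_full = M @ M.T
sigma_marginal = sigma_full[1:, 1:]   # drop hidden H: covariance of (X, Y)

# DMG over (X, Y) alone: directed edge X -> Y (coefficient c) plus a
# bidirected edge X <-> Y, encoded as an error covariance matrix Omega;
# marginalizing H induces Cov(u_X, u_Y) = a*b.
B_dmg = np.array([[0.0, 0.0],
                  [c,   0.0]])
omega = np.array([[a**2 + 1.0, a * b],
                  [a * b,      b**2 + 1.0]])
M2 = np.linalg.inv(np.eye(2) - B_dmg)
sigma_dmg = M2 @ omega @ M2.T

print(np.allclose(sigma_marginal, sigma_dmg))  # -> True
```

The agreement of the two matrices is the point of the abstract's "principle": the dependence generated by the hidden variable is captured by the bidirected edge's error covariance, with no explicit latent variable in the DMG parameterization.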
Pages: 1187-1238
Page count: 52