Scalable Inference for Gaussian Process Models with Black-Box Likelihoods

Cited: 0
Authors
Dezfouli, Amir [1]
Bonilla, Edwin V. [1]
Affiliations
[1] Univ New South Wales, Sydney, NSW, Australia
Source
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 28 (NIPS 2015) | 2015 / Vol. 28
Funding
Australian Research Council
Keywords
DOI
(not available)
CLC number
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We propose a sparse method for scalable automated variational inference (AVI) in a large class of models with Gaussian process (GP) priors, multiple latent functions, multiple outputs and non-linear likelihoods. Our approach maintains the statistical efficiency property of the original AVI method, requiring only expectations over univariate Gaussian distributions to approximate the posterior with a mixture of Gaussians. Experiments on small datasets for various problems including regression, classification, Log Gaussian Cox processes, and warped GPs show that our method can perform as well as the full method under high sparsity levels. On larger experiments using the MNIST and the SARCOS datasets we show that our method can provide superior performance to previously published scalable approaches that have been handcrafted to specific likelihood models.
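The abstract's key computational claim is statistical efficiency: the variational objective needs only expectations of the (black-box) log-likelihood under univariate Gaussian marginals. A minimal sketch of how such an expectation can be estimated with Gauss-Hermite quadrature is shown below; this is an illustration of the general idea, not the authors' implementation, and the helper name `expected_log_lik` and the Gaussian-likelihood example are assumptions for demonstration.

```python
import numpy as np

def expected_log_lik(log_lik, mean, var, n_points=20):
    """Estimate E_{f ~ N(mean, var)}[log p(y | f)] by Gauss-Hermite quadrature.

    `log_lik` is treated as a black box: only pointwise evaluations are needed,
    which is what makes the scheme likelihood-agnostic.
    """
    # Nodes x_i and weights w_i for integrals of the form \int e^{-x^2} g(x) dx.
    x, w = np.polynomial.hermite.hermgauss(n_points)
    # Change of variables f = mean + sqrt(2 * var) * x maps to the N(mean, var) density.
    f = mean + np.sqrt(2.0 * var) * x
    return np.sum(w * log_lik(f)) / np.sqrt(np.pi)

# Check against a case with a closed form: a Gaussian likelihood y ~ N(f, sigma_n2),
# where E[log p(y|f)] = -0.5*log(2*pi*sigma_n2) - 0.5*((y-m)^2 + v)/sigma_n2.
y, sigma_n2 = 0.5, 0.1
log_lik = lambda f: -0.5 * np.log(2 * np.pi * sigma_n2) - 0.5 * (y - f) ** 2 / sigma_n2
m, v = 0.2, 0.3
est = expected_log_lik(log_lik, m, v)
exact = -0.5 * np.log(2 * np.pi * sigma_n2) - 0.5 * ((y - m) ** 2 + v) / sigma_n2
```

For a quadratic log-likelihood the quadrature is exact, so `est` matches `exact` to machine precision; for genuinely non-linear likelihoods (e.g. classification or Log Gaussian Cox processes, as in the paper's experiments) it becomes an accurate approximation.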
Pages: 9