Distributed Variational Inference in Sparse Gaussian Process Regression and Latent Variable Models

Cited by: 0
Authors
Gal, Yarin [1 ]
van der Wilk, Mark [1 ]
Rasmussen, Carl E. [1 ]
Affiliations
[1] Univ Cambridge, Cambridge, England
Source
Advances in Neural Information Processing Systems 27 (NIPS 2014) | 2014 / Vol. 27
Keywords
DOI
None available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Gaussian processes (GPs) are a powerful tool for probabilistic inference over functions. They have been applied to both regression and non-linear dimensionality reduction, and offer desirable properties such as uncertainty estimates, robustness to over-fitting, and principled ways for tuning hyper-parameters. However, the scalability of these models to big datasets remains an active topic of research. We introduce a novel re-parametrisation of variational inference for sparse GP regression and latent variable models that allows for an efficient distributed algorithm. This is done by exploiting the decoupling of the data given the inducing points to re-formulate the evidence lower bound in a Map-Reduce setting. We show that the inference scales well with data and computational resources, while preserving a balanced distribution of the load among the nodes. We further demonstrate the utility of the approach in scaling Gaussian processes to big data. We show that GP performance improves with increasing amounts of data in regression (on flight data with 2 million records) and latent variable modelling (on MNIST). The results show that GPs perform better than many common models often used for big data.
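
A minimal NumPy sketch (illustrative, not the authors' code) of the Map-Reduce structure the abstract describes: once the inducing inputs Z are fixed, a collapsed sparse-GP lower bound in the style of Titsias (2009) depends on the data only through sums of per-point statistics, so each node can reduce its shard to fixed-size partial sums (map) that are then added together (reduce). The RBF kernel, the function names, and the use of the standard collapsed bound rather than the paper's exact re-parametrised objective are all assumptions made for illustration.

import numpy as np

def rbf(A, B, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel matrix between the rows of A and B.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def map_step(X_shard, y_shard, Z):
    # Map: each node reduces its data shard to fixed-size sufficient statistics.
    Kmn = rbf(Z, X_shard)                      # m x n_shard cross-covariances
    return {"A": Kmn @ Kmn.T,                  # sum_i k_i k_i^T   (m x m)
            "b": Kmn @ y_shard,                # sum_i k_i y_i     (m,)
            "yy": float(y_shard @ y_shard),    # sum_i y_i^2
            "tr": float(X_shard.shape[0]),     # sum_i k(x_i, x_i); equals n_shard for this unit-variance RBF
            "n": X_shard.shape[0]}

def reduce_step(stats_list):
    # Reduce: partial statistics from all nodes combine by simple addition.
    return {k: sum(s[k] for s in stats_list) for k in stats_list[0]}

def collapsed_bound(stats, Z, sigma2, jitter=1e-6):
    # Titsias-style collapsed bound, evaluated from the aggregated statistics:
    #   log N(y | 0, Qnn + sigma2*I) - tr(Knn - Qnn) / (2*sigma2),
    # rewritten via the matrix inversion and determinant lemmas so that the
    # data enter only through A, b, yy, tr and n.
    Kmm = rbf(Z, Z) + jitter * np.eye(Z.shape[0])
    Sigma = Kmm + stats["A"] / sigma2
    quad = stats["yy"] / sigma2 - stats["b"] @ np.linalg.solve(Sigma, stats["b"]) / sigma2**2
    logdet = (np.linalg.slogdet(Sigma)[1] - np.linalg.slogdet(Kmm)[1]
              + stats["n"] * np.log(sigma2))
    trace = (stats["tr"] - np.trace(np.linalg.solve(Kmm, stats["A"]))) / sigma2
    return -0.5 * (stats["n"] * np.log(2.0 * np.pi) + logdet + quad + trace)

# Toy usage: 4 shards standing in for 4 worker nodes.
rng = np.random.default_rng(0)
X, y = rng.normal(size=(1000, 2)), rng.normal(size=1000)
Z = rng.normal(size=(20, 2))                   # m = 20 inducing inputs
shards = zip(np.array_split(X, 4), np.array_split(y, 4))
stats = reduce_step([map_step(Xs, ys, Z) for Xs, ys in shards])
print(collapsed_bound(stats, Z, sigma2=0.1))

Because each shard contributes only an m x m matrix, an m-vector and a few scalars, the communication cost per node is independent of the shard size; this is the property that the balanced distributed evaluation described in the abstract relies on.
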
Pages: 9
Related Papers
50 records in total
  • [11] Incremental Variational Sparse Gaussian Process Regression
    Cheng, Ching-An
    Boots, Byron
    Advances in Neural Information Processing Systems 29 (NIPS 2016), 2016, Vol. 29
  • [12] Variational Inference for Sparse Gaussian Process Modulated Hawkes Process
    Zhang, Rui
    Walder, Christian
    Rizoiu, Marian-Andrei
    Thirty-Fourth AAAI Conference on Artificial Intelligence (AAAI 2020), 2020, 34: 6803-6810
  • [13] Asynchronous Distributed Variational Gaussian Process for Regression
    Peng, Hao
    Zhe, Shandian
    Qi, Yuan
    Zhang, Xiao
    International Conference on Machine Learning (ICML), 2017, Vol. 70
  • [14] Pseudo-marginal Bayesian inference for Gaussian process latent variable models
    Gadd, C.
    Wade, S.
    Shah, A. A.
    Machine Learning, 2021, 110(6): 1105-1143
  • [16] Rates of Convergence for Sparse Variational Gaussian Process Regression
    Burt, David R.
    Rasmussen, Carl Edward
    van der Wilk, Mark
    International Conference on Machine Learning (ICML), 2019, Vol. 97
  • [17] Automated Variational Inference for Gaussian Process Models
    Nguyen, Trung V.
    Bonilla, Edwin V.
    Advances in Neural Information Processing Systems 27 (NIPS 2014), 2014, Vol. 27
  • [18] Generic Inference in Latent Gaussian Process Models
    Bonilla, Edwin V.
    Krauth, Karl
    Dezfouli, Amir
    Journal of Machine Learning Research, 2019, 20
  • [20] A Gaussian Process Latent Variable Model for BRDF Inference
    Georgoulis, Stamatios
    Vanweddingen, Vincent
    Proesmans, Marc
    Van Gool, Luc
    2015 IEEE International Conference on Computer Vision (ICCV), 2015: 3559-3567