Distributed Variational Inference in Sparse Gaussian Process Regression and Latent Variable Models

Citations: 0
Authors
Gal, Yarin [1 ]
van der Wilk, Mark [1 ]
Rasmussen, Carl E. [1 ]
Affiliations
[1] University of Cambridge, Cambridge, England
Source
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 27 (NIPS 2014) | 2014 / Vol. 27
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Gaussian processes (GPs) are a powerful tool for probabilistic inference over functions. They have been applied to both regression and non-linear dimensionality reduction, and offer desirable properties such as uncertainty estimates, robustness to over-fitting, and principled ways for tuning hyper-parameters. However, the scalability of these models to big datasets remains an active topic of research. We introduce a novel re-parametrisation of variational inference for sparse GP regression and latent variable models that allows for an efficient distributed algorithm. This is done by exploiting the decoupling of the data given the inducing points to re-formulate the evidence lower bound in a Map-Reduce setting. We show that the inference scales well with data and computational resources, while preserving a balanced distribution of the load among the nodes. We further demonstrate its utility in scaling Gaussian processes to big data. We show that GP performance improves with increasing amounts of data in regression (on flight data with 2 million records) and latent variable modelling (on MNIST). The results show that GPs perform better than many common models often used for big data.
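The Map-Reduce idea in the abstract can be made concrete: given the inducing inputs, the variational lower bound touches the data only through sums of per-point statistics, so each node can process its own shard independently and the reduce step combines matrices whose size depends on the number of inducing points M, not on N. Below is a minimal numpy sketch of that structure, not the paper's actual implementation: the names (rbf_kernel, local_stats, collapsed_bound) are illustrative, and for brevity the reduce step evaluates the collapsed Titsias-style regression bound from the combined statistics, whereas the paper's re-parametrisation keeps explicit variational parameters over the inducing points while relying on the same decoupled sums.

    import numpy as np

    def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
        # Squared-exponential covariance; stands in for any valid kernel.
        sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return variance * np.exp(-0.5 * sq / lengthscale ** 2)

    def local_stats(X_shard, y_shard, Z, kernel):
        # Map step: each node sees only its own shard of the data.
        Knm = kernel(X_shard, Z)                         # N_shard x M cross-covariances
        return (Knm.T @ Knm,                             # sum_i k_i k_i^T
                Knm.T @ y_shard,                         # sum_i y_i k_i
                float(y_shard @ y_shard),                # sum_i y_i^2
                float(np.trace(kernel(X_shard, X_shard))),  # sum_i k(x_i, x_i); only the diagonal is needed in practice
                len(y_shard))

    def collapsed_bound(stats, Z, kernel, sigma2):
        # Reduce step: element-wise sums of the per-shard statistics.
        A  = sum(s[0] for s in stats)                    # K_mn K_nm  (M x M)
        b  = sum(s[1] for s in stats)                    # K_mn y     (M,)
        yy = sum(s[2] for s in stats)                    # y^T y
        tr = sum(s[3] for s in stats)                    # tr(K_nn)
        n  = sum(s[4] for s in stats)

        Kmm = kernel(Z, Z) + 1e-6 * np.eye(len(Z))       # jitter for numerical stability
        B = Kmm + A / sigma2
        # log|Q_nn + sigma2*I| via the matrix determinant lemma
        logdet = (np.linalg.slogdet(B)[1] - np.linalg.slogdet(Kmm)[1]
                  + n * np.log(sigma2))
        # y^T (Q_nn + sigma2*I)^{-1} y via the Woodbury identity
        quad = yy / sigma2 - b @ np.linalg.solve(B, b) / sigma2 ** 2
        # Collapsed bound plus the trace term penalising the sparse approximation
        return (-0.5 * (n * np.log(2 * np.pi) + logdet + quad)
                - 0.5 / sigma2 * (tr - np.trace(np.linalg.solve(Kmm, A))))

    # Usage: shard the data, map in parallel, reduce on a single node.
    rng = np.random.default_rng(0)
    X, y = rng.normal(size=(1000, 2)), rng.normal(size=1000)
    Z = X[:20]                                           # 20 inducing inputs
    shards = zip(np.array_split(X, 4), np.array_split(y, 4))
    stats = [local_stats(Xs, ys, Z, rbf_kernel) for Xs, ys in shards]
    print(collapsed_bound(stats, Z, rbf_kernel, sigma2=0.1))

Because each mapper returns only an M x M matrix, an M-vector, and a few scalars, communication cost is independent of the number of data points, which is what lets the inference scale with added nodes while keeping the per-node load balanced.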
Pages: 9