Distributed Variational Inference in Sparse Gaussian Process Regression and Latent Variable Models

Cited: 0
Authors
Gal, Yarin [1 ]
van der Wilk, Mark [1 ]
Rasmussen, Carl E. [1 ]
Affiliation
[1] Univ Cambridge, Cambridge, England
Source
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 27 (NIPS 2014) | 2014 / Vol. 27
Keywords
DOI
None available
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Gaussian processes (GPs) are a powerful tool for probabilistic inference over functions. They have been applied to both regression and non-linear dimensionality reduction, and offer desirable properties such as uncertainty estimates, robustness to over-fitting, and principled ways for tuning hyper-parameters. However, the scalability of these models to big datasets remains an active topic of research. We introduce a novel re-parametrisation of variational inference for sparse GP regression and latent variable models that allows for an efficient distributed algorithm. This is done by exploiting the decoupling of the data given the inducing points to re-formulate the evidence lower bound in a Map-Reduce setting. We show that the inference scales well with data and computational resources, while preserving a balanced distribution of the load among the nodes. We further demonstrate the utility in scaling Gaussian processes to big data. We show that GP performance improves with increasing amounts of data in regression (on flight data with 2 million records) and latent variable modelling (on MNIST). The results show that GPs perform better than many common models often used for big data.
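The Map-Reduce reformulation described in the abstract rests on the fact that, given the inducing points, the sufficient statistics of the sparse-GP variational bound are sums over data points, so each node can compute its partial sums independently and a reduce step adds them. A minimal numpy sketch of that idea, assuming an RBF kernel and Titsias-style collapsed-bound statistics (the function names, the kernel choice, and the partitioning here are illustrative, not the paper's actual implementation):

```python
import numpy as np

def rbf(A, B, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel matrix between the rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def local_stats(X_i, y_i, Z):
    # "Map" step: one node's sufficient statistics for the collapsed bound,
    # computed from its local data partition (X_i, y_i) and the shared
    # inducing inputs Z.
    Kmn = rbf(Z, X_i)                          # m x n_i cross-covariance
    return Kmn @ Kmn.T, Kmn @ y_i, float(y_i @ y_i), len(y_i)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
Z = np.linspace(-2.0, 2.0, 10)[:, None]        # shared inducing inputs

# "Reduce" step: the statistics are additive over data partitions,
# so summing the per-node results recovers the full-data quantities.
parts = [local_stats(X[s], y[s], Z)
         for s in np.array_split(np.arange(len(y)), 4)]
A = sum(p[0] for p in parts)                   # sum_i Kmn_i Kmn_i^T
b = sum(p[1] for p in parts)                   # sum_i Kmn_i y_i

# Sanity check: the reduced sums match the same statistics computed
# on the full, unpartitioned dataset.
Kmn_full = rbf(Z, X)
assert np.allclose(A, Kmn_full @ Kmn_full.T)
assert np.allclose(b, Kmn_full @ y)
```

Because only these m x m and m-dimensional aggregates cross node boundaries, communication cost is independent of the per-node data size, which is what makes the distributed evaluation of the bound balanced across nodes.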
Pages: 9
Related Papers
50 items total
  • [21] A review on Gaussian Process Latent Variable Models
    Li, Ping
    Chen, Songcan
    CAAI TRANSACTIONS ON INTELLIGENCE TECHNOLOGY, 2016, 1 (04) : 366+
  • [22] Ensembles of Gaussian process latent variable models
    Ajirak, Marzieh
    Liu, Yuhao
    Djuric, Petar M.
    2022 30TH EUROPEAN SIGNAL PROCESSING CONFERENCE (EUSIPCO 2022), 2022, : 1467 - 1471
  • [23] Contraction rates for sparse variational approximations in Gaussian process regression
    Nieman, Dennis
    Szabo, Botond
    van Zanten, Harry
    JOURNAL OF MACHINE LEARNING RESEARCH, 2022, 23
  • [25] Variational Inference for Gaussian Process Models for Survival Analysis
    Kim, Minyoung
    Pavlovic, Vladimir
    UNCERTAINTY IN ARTIFICIAL INTELLIGENCE, 2018, : 435 - 445
  • [26] Efficient inference for sparse latent variable models of transcriptional regulation
    Dai, Zhenwen
    Iqbal, Mudassar
    Lawrence, Neil D.
    Rattray, Magnus
    BIOINFORMATICS, 2017, 33 (23) : 3776 - 3783
  • [27] Variational Inference for Gaussian Process Models with Linear Complexity
    Cheng, Ching-An
    Boots, Byron
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 30 (NIPS 2017), 2017, 30
  • [28] Perturbative Corrections for Approximate Inference in Gaussian Latent Variable Models
    Opper, Manfred
    Paquet, Ulrich
    Winther, Ole
    JOURNAL OF MACHINE LEARNING RESEARCH, 2013, 14 : 2857 - 2898
  • [29] Gaussian Mixture Modeling with Gaussian Process Latent Variable Models
    Nickisch, Hannes
    Rasmussen, Carl Edward
    PATTERN RECOGNITION, 2010, 6376 : 272 - 282
  • [30] Brain Shape Correspondence Analysis Using Variational Mixtures for Gaussian Process Latent Variable Models
    Minoli, Juan P. V.
    Orozco, Alvaro A.
    Porras-Hurtado, Gloria L.
    Garcia, Hernan F.
    ARTIFICIAL INTELLIGENCE IN NEUROSCIENCE: AFFECTIVE ANALYSIS AND HEALTH APPLICATIONS, PT I, 2022, 13258 : 547 - 556