Distributed Variational Inference in Sparse Gaussian Process Regression and Latent Variable Models

Times Cited: 0
Authors
Gal, Yarin [1 ]
van der Wilk, Mark [1 ]
Rasmussen, Carl E. [1 ]
Affiliations
[1] Univ Cambridge, Cambridge, England
Source
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 27 (NIPS 2014) | 2014 / Vol. 27
Keywords
DOI
(not available)
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Gaussian processes (GPs) are a powerful tool for probabilistic inference over functions. They have been applied to both regression and non-linear dimensionality reduction, and offer desirable properties such as uncertainty estimates, robustness to over-fitting, and principled ways of tuning hyper-parameters. However, the scalability of these models to big datasets remains an active topic of research. We introduce a novel re-parametrisation of variational inference for sparse GP regression and latent variable models that allows for an efficient distributed algorithm. This is done by exploiting the decoupling of the data given the inducing points to re-formulate the evidence lower bound in a Map-Reduce setting. We show that the inference scales well with data and computational resources, while preserving a balanced distribution of the load among the nodes. We further demonstrate the utility in scaling Gaussian processes to big data. We show that GP performance improves with increasing amounts of data in regression (on flight data with 2 million records) and latent variable modelling (on MNIST). The results show that GPs perform better than many common models often used for big data.
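The Map-Reduce reformulation described in the abstract rests on the fact that, once the inducing points are fixed, the data-dependent terms of the evidence lower bound decompose into sums over individual data points, so each node can compute partial statistics on its own shard and a reduce step simply adds them. A minimal NumPy sketch of that idea, assuming a squared-exponential kernel; all function names, shapes, and the choice of statistics are illustrative, not taken from the paper:

```python
import numpy as np

def rbf(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def map_stats(X_chunk, y_chunk, Z):
    """'Map' step: one node's sufficient statistics from its data shard."""
    Kzx = rbf(Z, X_chunk)                        # M x n_c cross-covariances
    return {
        "yy":   y_chunk @ y_chunk,               # sum_i y_i^2
        "Kzy":  Kzx @ y_chunk,                   # sum_i k(Z, x_i) y_i
        "KzKz": Kzx @ Kzx.T,                     # sum_i k(Z, x_i) k(x_i, Z)
        "trK":  np.trace(rbf(X_chunk, X_chunk)), # sum_i k(x_i, x_i)
    }

def reduce_stats(partials):
    """'Reduce' step: global statistics are element-wise sums."""
    return {k: sum(p[k] for p in partials) for k in partials[0]}

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))                    # inputs
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
Z = rng.normal(size=(10, 2))                     # inducing inputs

# Distribute the data over 4 "nodes" and aggregate their statistics.
parts = [map_stats(Xc, yc, Z)
         for Xc, yc in zip(np.array_split(X, 4), np.array_split(y, 4))]
stats = reduce_stats(parts)

# The aggregated statistics match a single-machine computation.
full = map_stats(X, y, Z)
```

In this sketch each node's cost is linear in its shard size (for fixed M inducing points), and only O(M^2)-sized statistics travel over the network, which is what makes the balanced load distribution mentioned in the abstract possible.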
Pages: 9
Related Papers
50 records total
  • [41] DISTRIBUTED VARIATIONAL INFERENCE-BASED HETEROSCEDASTIC GAUSSIAN PROCESS METAMODELING
    Wang, Wenjing
    Chen, Xi
    2019 WINTER SIMULATION CONFERENCE (WSC), 2019, : 380 - 391
  • [42] A Fixed-Point Operator for Inference in Variational Bayesian Latent Gaussian Models
    Sheth, Rishit
    Khardon, Roni
    ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 51, 2016, 51 : 761 - 769
  • [43] Variational Inference for Sparse and Undirected Models
    Ingraham, John
    Marks, Debora
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 70, 2017, 70
  • [44] Imaging distributed sources with sparse ESM technique and Gaussian process regression
    Li, Jiangshuai
    Khilkevich, Victor
    He, Ruijie
    Liu, Yuanzhuo
    Zhou, Jiahao
    2021 JOINT IEEE INTERNATIONAL SYMPOSIUM ON ELECTROMAGNETIC COMPATIBILITY, SIGNAL & POWER INTEGRITY, AND EMC EUROPE (EMC+SIPI AND EMC EUROPE), 2021, : 23 - 28
  • [45] Probabilistic net load forecasting based on sparse variational Gaussian process regression
    Feng, Wentao
    Deng, Bingyan
    Chen, Tailong
    Zhang, Ziwen
    Fu, Yuheng
    Zheng, Yanxi
    Zhang, Le
    Jing, Zhiyuan
    FRONTIERS IN ENERGY RESEARCH, 2024, 12
  • [47] Stochastic variational inference for scalable non-stationary Gaussian process regression
    Paun, Ionut
    Husmeier, Dirk
    Torney, Colin J.
    STATISTICS AND COMPUTING, 2023, 33 (02)
  • [48] Harmonized Multimodal Learning with Gaussian Process Latent Variable Models
    Song, Guoli
    Wang, Shuhui
    Huang, Qingming
    Tian, Qi
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2021, 43 (03) : 858 - 872
  • [49] Classification of Streetsigns Using Gaussian Process Latent Variable Models
    Woeber, Wilfried
    Aburaia, Mohamed
    Olaverri-Monreal, Cristina
    2019 8TH IEEE INTERNATIONAL CONFERENCE ON CONNECTED VEHICLES AND EXPO (IEEE ICCVE), 2019,
  • [50] Craniofacial Reconstruction Using Gaussian Process Latent Variable Models
    Xiao, Zedong
    Zhao, Junli
    Qiao, Xuejun
    Duan, Fuqing
    COMPUTER ANALYSIS OF IMAGES AND PATTERNS, CAIP 2015, PT I, 2015, 9256 : 456 - 464