A Unifying Framework of Anytime Sparse Gaussian Process Regression Models with Stochastic Variational Inference for Big Data

Cited: 0
Authors
Hoang, Trong Nghia [1]
Hoang, Quang Minh [1]
Low, Kian Hsiang [1]
Affiliations
[1] Natl Univ Singapore, Dept Comp Sci, Singapore, Singapore
Source
INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 37, 2015, 37
Funding
National Research Foundation of Singapore
Keywords
DOI
None available
CLC Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
This paper presents a novel unifying framework of anytime sparse Gaussian process regression (SGPR) models that can produce good predictive performance fast and improve their predictive performance over time. Our proposed unifying framework reverses the variational inference procedure to theoretically construct a non-trivial, concave functional that is maximized at the predictive distribution of any SGPR model of our choice. As a result, a stochastic natural gradient ascent method can be derived that involves iteratively following the stochastic natural gradient of the functional to improve its estimate of the predictive distribution of the chosen SGPR model and is guaranteed to achieve asymptotic convergence to it. Interestingly, we show that if the predictive distribution of the chosen SGPR model satisfies certain decomposability conditions, then the stochastic natural gradient is an unbiased estimator of the exact natural gradient and can be computed in constant time (i.e., independent of data size) at each iteration. We empirically evaluate the trade-off between predictive performance and time efficiency of the anytime SGPR models on two real-world, million-sized datasets.
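The abstract's key computational claim is that a stochastic gradient, estimated from a fixed-size minibatch, is an unbiased estimator of the exact gradient and costs constant time per iteration regardless of dataset size. The following toy sketch (not the paper's algorithm, and using a plain rather than natural gradient) illustrates that property on a simple concave objective F(m) = -0.5 E[(y - m)^2], whose maximizer is the data mean; all variable names here are illustrative assumptions.

```python
import numpy as np

# Toy sketch: stochastic gradient ascent on a concave objective whose
# maximizer is the mean of a million-sized dataset. Each iteration touches
# only a fixed-size minibatch, so per-step cost is independent of data size.
rng = np.random.default_rng(0)
y = rng.normal(loc=3.0, scale=1.0, size=1_000_000)  # "big data" targets

m = 0.0            # current anytime estimate of the maximizer
batch_size = 64
for t in range(1, 2001):
    batch = rng.choice(y, size=batch_size)  # O(batch_size) work per step
    grad = np.mean(batch - m)               # unbiased estimate of dF/dm
    m += grad / t                           # Robbins-Monro step size 1/t
print(round(m, 1))                          # -> close to 3.0
```

With the 1/t step size, the iterate is an average of minibatch means, so it converges to the data mean while each iteration stays cheap; this is the "anytime" behavior the abstract describes, where the estimate improves the longer the procedure runs.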
Pages: 569-578
Page count: 10
Related Papers
50 records in total
  • [1] A Distributed Variational Inference Framework for Unifying Parallel Sparse Gaussian Process Regression Models
    Hoang, Trong Nghia
    Hoang, Quang Minh
    Low, Bryan Kian Hsiang
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 48, 2016, 48
  • [2] Stochastic Variational Inference for Bayesian Sparse Gaussian Process Regression
    Yu, Haibin
    Hoang, Trong Nghia
    Low, Bryan Kian Hsiang
    Jaillet, Patrick
    2019 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2019
  • [3] Variational inference for sparse spectrum Gaussian process regression
    Tan, Linda S. L.
    Ong, Victor M. H.
    Nott, David J.
    Jasra, Ajay
    STATISTICS AND COMPUTING, 2016, 26 (06) : 1243 - 1261
  • [4] Distributed Variational Inference in Sparse Gaussian Process Regression and Latent Variable Models
    Gal, Yarin
    van der Wilk, Mark
    Rasmussen, Carl E.
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 27 (NIPS 2014), 2014, 27
  • [5] Sparse Variational Inference for Generalized Gaussian Process Models
    Sheth, Rishit
    Wang, Yuyang
    Khardon, Roni
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 37, 2015, 37 : 1302 - 1311
  • [6] A Generalized Stochastic Variational Bayesian Hyperparameter Learning Framework for Sparse Spectrum Gaussian Process Regression
    Hoang, Quang Minh
    Hoang, Trong Nghia
    Low, Kian Hsiang
    THIRTY-FIRST AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2017, : 2007 - 2014
  • [7] Convergence of Sparse Variational Inference in Gaussian Processes Regression
    Burt, David R.
    Rasmussen, Carl Edward
    van der Wilk, Mark
    JOURNAL OF MACHINE LEARNING RESEARCH, 2020, 21
  • [8] Stochastic variational inference for scalable non-stationary Gaussian process regression
    Paun, Ionut
    Husmeier, Dirk
    Torney, Colin J.
    STATISTICS AND COMPUTING, 2023, 33