A Unifying Framework of Anytime Sparse Gaussian Process Regression Models with Stochastic Variational Inference for Big Data

Cited by: 0
Authors
Hoang, Trong Nghia [1 ]
Hoang, Quang Minh [1 ]
Low, Kian Hsiang [1 ]
Affiliations
[1] Natl Univ Singapore, Dept Comp Sci, Singapore, Singapore
Source
INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 37 | 2015 / Vol. 37
Funding
National Research Foundation, Singapore;
Keywords
DOI
Not available
CLC number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This paper presents a novel unifying framework of anytime sparse Gaussian process regression (SGPR) models that can produce good predictive performance quickly and improve it over time. Our proposed unifying framework reverses the variational inference procedure to theoretically construct a non-trivial, concave functional that is maximized at the predictive distribution of any SGPR model of our choice. As a result, a stochastic natural gradient ascent method can be derived that iteratively follows the stochastic natural gradient of the functional to improve its estimate of the predictive distribution of the chosen SGPR model, with guaranteed asymptotic convergence. Interestingly, we show that if the predictive distribution of the chosen SGPR model satisfies certain decomposability conditions, then the stochastic natural gradient is an unbiased estimator of the exact natural gradient and can be computed in constant time (i.e., independent of data size) at each iteration. We empirically evaluate the trade-off between predictive performance and time efficiency of the anytime SGPR models on two real-world million-sized datasets.
Pages: 569-578 (10 pages)
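
To illustrate the stochastic natural gradient ascent scheme described in the abstract, below is a minimal NumPy sketch of the standard stochastic variational inference update for sparse GP regression with a Gaussian variational distribution over inducing outputs (in the style of Hensman et al.'s SVI-GP). This is an assumption-laden illustration, not the authors' reversed variational construction: the functions rbf and svi_sgpr, the inducing inputs Z, and all hyperparameter defaults are illustrative. Each iteration touches only a mini-batch of size b and costs O(M^2 b) for M inducing points, independent of the full data size n, matching the constant-per-iteration property claimed in the abstract.

    import numpy as np

    def rbf(X1, X2, lengthscale=1.0, variance=1.0):
        # Squared-exponential kernel matrix between two input sets (rows = points).
        d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
        return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

    def svi_sgpr(X, y, Z, noise_var=0.1, lr=0.1, batch=256, iters=1000, seed=0):
        # Stochastic natural gradient ascent on the evidence lower bound of a
        # sparse GP regression model with inducing inputs Z (M x d).
        rng = np.random.default_rng(seed)
        n, M = X.shape[0], Z.shape[0]
        Kuu_inv = np.linalg.inv(rbf(Z, Z) + 1e-6 * np.eye(M))
        # Natural parameters of q(u) = N(m, S): eta1 = S^{-1} m, eta2 = -0.5 S^{-1}.
        eta1 = np.zeros(M)
        eta2 = -0.5 * Kuu_inv                      # initialise q(u) at the prior
        b = min(batch, n)
        for _ in range(iters):
            idx = rng.choice(n, size=b, replace=False)
            A = Kuu_inv @ rbf(Z, X[idx])           # M x b interpolation weights
            scale = n / b                          # rescale so the mini-batch term is unbiased
            # For a Gaussian likelihood, a unit step of the natural gradient lands on
            # the batch-optimal natural parameters, so each update is a convex mix.
            g1 = (scale / noise_var) * (A @ y[idx]) - eta1
            g2 = -0.5 * (Kuu_inv + (scale / noise_var) * (A @ A.T)) - eta2
            eta1, eta2 = eta1 + lr * g1, eta2 + lr * g2
        S = np.linalg.inv(-2.0 * eta2)             # recover moment parameters
        return S @ eta1, S                         # posterior mean and covariance of q(u)

Usage would be m, S = svi_sgpr(X, y, Z) with Z chosen, for instance, as a random subset of X. The anytime property the abstract refers to corresponds here to the fact that stopping the loop after any number of iterations still yields a valid variational posterior q(u), whose quality improves with more iterations.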