A Unifying Framework of Anytime Sparse Gaussian Process Regression Models with Stochastic Variational Inference for Big Data

Cited by: 0
Authors
Hoang, Trong Nghia [1 ]
Hoang, Quang Minh [1 ]
Low, Kian Hsiang [1 ]
Affiliations
[1] Natl Univ Singapore, Dept Comp Sci, Singapore, Singapore
Source
INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 37 | 2015 / Vol. 37
Funding
Singapore National Research Foundation;
Keywords
DOI
None
CLC Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
This paper presents a novel unifying framework of anytime sparse Gaussian process regression (SGPR) models that can deliver good predictive performance quickly and improve their predictive performance over time. Our proposed unifying framework reverses the variational inference procedure to theoretically construct a non-trivial, concave functional that is maximized at the predictive distribution of any SGPR model of our choice. As a result, a stochastic natural gradient ascent method can be derived that involves iteratively following the stochastic natural gradient of the functional to improve its estimate of the predictive distribution of the chosen SGPR model and is guaranteed to achieve asymptotic convergence to it. Interestingly, we show that if the predictive distribution of the chosen SGPR model satisfies certain decomposability conditions, then the stochastic natural gradient is an unbiased estimator of the exact natural gradient and can be computed in constant time (i.e., independent of data size) at each iteration. We empirically evaluate the trade-off between predictive performance and time efficiency of the anytime SGPR models on two real-world million-sized datasets.
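The abstract's recipe (stochastic natural-gradient ascent with unbiased, constant-time-per-iteration gradients and Robbins-Monro step sizes) can be sketched on a toy conjugate-Gaussian model rather than the paper's SGPR setting. This is a minimal illustration of the SVI-style natural-parameter update the framework builds on, not the authors' algorithm; the model, the function name `svi_natural_gradient`, and all constants below are assumptions for illustration.

```python
import random

# Hypothetical toy model (not the paper's SGPR model): x_i ~ N(mu, 1)
# with prior mu ~ N(0, 1). The exact posterior over mu is Gaussian with
# natural parameters eta1* = sum_i x_i and eta2* = -(n + 1) / 2.
random.seed(0)
n = 100_000
true_mu = 2.0
data = [random.gauss(true_mu, 1.0) for _ in range(n)]

def svi_natural_gradient(data, n_iters=2000, batch=32):
    """Stochastic natural-gradient ascent (SVI-style) on the ELBO.

    Each iteration touches only a constant-size minibatch, so the
    per-iteration cost is independent of the data size n, and the
    Robbins-Monro step sizes rho_t = 1/t give asymptotic convergence
    to the exact posterior's natural parameters.
    """
    n = len(data)
    eta1, eta2 = 0.0, -0.5  # start at the prior N(0, 1)
    for t in range(1, n_iters + 1):
        mb = random.sample(data, batch)
        # Unbiased estimate of the optimal natural parameters: rescale
        # the minibatch sufficient statistic by n / batch.
        eta1_hat = (n / batch) * sum(mb)
        eta2_hat = -(n + 1) / 2.0
        rho = 1.0 / t  # Robbins-Monro step size
        eta1 = (1 - rho) * eta1 + rho * eta1_hat
        eta2 = (1 - rho) * eta2 + rho * eta2_hat
    # Convert natural parameters back to mean / variance.
    var = -1.0 / (2.0 * eta2)
    return eta1 * var, var

mu_hat, var_hat = svi_natural_gradient(data)
print(mu_hat, var_hat)  # mu_hat close to 2.0, var_hat close to 1 / (n + 1)
```

In the paper's setting the variational distribution is over the inducing outputs of an SGPR model and the functional is constructed by reversing variational inference, but the anytime character is the same: the estimate is usable after any number of iterations and keeps improving with more.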
Pages: 569-578
Page count: 10