Stochastic variational inference for scalable non-stationary Gaussian process regression

Cited by: 2
Authors:
Paun, Ionut [1 ]
Husmeier, Dirk [1 ]
Torney, Colin J. [1 ]
Affiliations:
[1] Univ Glasgow, Sch Math & Stat, Glasgow City G12 8QQ, Scotland
Funding:
UK Engineering and Physical Sciences Research Council;
Keywords:
Approximate Bayesian inference; Variational inference; Machine learning; Large-scale data; Gaussian process; Non-stationary;
DOI:
10.1007/s11222-023-10210-w
CLC number: TP301 [Theory and Methods];
Discipline code: 081202;
Abstract:
A natural extension to standard Gaussian process (GP) regression is the use of non-stationary Gaussian processes, an approach in which the parameters of the covariance kernel are allowed to vary in time or space. The non-stationary GP is a flexible model that relaxes the strong prior assumption of standard GP regression that the covariance properties of the inferred functions are constant across the input space. Non-stationary GPs typically model the varying covariance kernel parameters as further lower-level GPs, thereby enabling sampling-based inference. However, due to the high computational cost and inherently sequential nature of MCMC sampling, these methods do not scale to large datasets. Here we develop a variational inference approach to fitting non-stationary GPs that combines sparse GP regression methods with a trajectory segmentation technique. Our method scales to large datasets containing potentially millions of data points. We demonstrate the effectiveness of our approach on both synthetic and real-world datasets.
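As a concrete illustration of the non-stationarity described in the abstract, a standard way to let a kernel's lengthscale vary over the input space is the Gibbs covariance function, which remains positive semi-definite for any smooth lengthscale function. The sketch below is a minimal illustration of that construction, not the authors' method (their paper builds a variational, sparse-GP treatment on top of such input-dependent kernel parameters); the sigmoid lengthscale function `lfn` is a hypothetical choice for demonstration.

```python
import numpy as np

def gibbs_kernel(x1, x2, lengthscale_fn):
    """Non-stationary Gibbs covariance with an input-dependent lengthscale l(x):

    k(x, x') = sqrt(2 l(x) l(x') / (l(x)^2 + l(x')^2))
               * exp(-(x - x')^2 / (l(x)^2 + l(x')^2))
    """
    l1 = lengthscale_fn(x1)[:, None]          # shape (n, 1)
    l2 = lengthscale_fn(x2)[None, :]          # shape (1, m)
    sq_sum = l1**2 + l2**2
    prefactor = np.sqrt(2.0 * l1 * l2 / sq_sum)
    sq_dist = (x1[:, None] - x2[None, :])**2
    return prefactor * np.exp(-sq_dist / sq_sum)

# Hypothetical smoothly varying lengthscale: short on the left of the
# input space, long on the right, so correlations lengthen with x.
lfn = lambda x: 0.2 + 0.8 / (1.0 + np.exp(-x))

x = np.linspace(-3.0, 3.0, 50)
K = gibbs_kernel(x, x, lfn)   # valid covariance matrix: symmetric, PSD, unit diagonal
```

A stationary squared-exponential kernel is recovered when `lengthscale_fn` is constant; making it input-dependent is what allows the inferred function's smoothness to change across the input space.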
Pages: 21