Rates of Convergence for Sparse Variational Gaussian Process Regression

Cited by: 0
Authors
Burt, David R. [1]
Rasmussen, Carl Edward [1,2]
van der Wilk, Mark [2]
Affiliations
[1] Univ Cambridge, Cambridge, England
[2] PROWLER.io, Cambridge, England
Source
INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97 | 2019 / Vol. 97
Keywords
APPROXIMATION; MATRIX;
DOI
N/A
CLC Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Excellent variational approximations to Gaussian process posteriors have been developed which avoid the O(N^3) scaling with dataset size N. They reduce the computational cost to O(NM^2), with M << N the number of inducing variables, which summarise the process. While the computational cost seems to be linear in N, the true complexity of the algorithm depends on how M must increase to ensure a certain quality of approximation. We show that with high probability the KL divergence can be made arbitrarily small by growing M more slowly than N. A particular case is that for regression with normally distributed inputs in D dimensions with the Squared Exponential kernel, M = O(log^D N) suffices. Our results show that as datasets grow, Gaussian process posteriors can be approximated cheaply, and provide a concrete rule for how to increase M in continual learning scenarios.
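The approximation the abstract refers to is the collapsed variational lower bound of Titsias (2009). Below is a minimal NumPy sketch of that bound together with the paper's M = O(log^D N) growth rule; the kernel hyperparameters, noise level, and the constant 3 in the rule are illustrative assumptions, not values from the paper.

```python
import numpy as np

def se_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    # Squared Exponential kernel matrix k(X1, X2).
    sq = (np.sum(X1**2, axis=1)[:, None]
          + np.sum(X2**2, axis=1)[None, :]
          - 2.0 * X1 @ X2.T)
    return variance * np.exp(-0.5 * sq / lengthscale**2)

def sgpr_elbo(X, y, Z, noise_var=0.1, jitter=1e-8):
    # Collapsed variational bound of Titsias (2009) for GP regression.
    # Every N-dependent operation below only forms M x N arrays, which
    # is the source of the O(NM^2) cost quoted in the abstract.
    N, M = X.shape[0], Z.shape[0]
    Kuu = se_kernel(Z, Z) + jitter * np.eye(M)
    Kuf = se_kernel(Z, X)                                # M x N
    L = np.linalg.cholesky(Kuu)
    A = np.linalg.solve(L, Kuf) / np.sqrt(noise_var)     # M x N
    B = np.eye(M) + A @ A.T                              # M x M
    LB = np.linalg.cholesky(B)
    c = np.linalg.solve(LB, A @ y) / np.sqrt(noise_var)  # M
    # log N(y; 0, Qff + noise*I) via the matrix inversion and
    # determinant lemmas, minus the trace correction
    # (tr(Kff) - tr(Qff)) / (2 * noise).
    tr_Kff = N * 1.0                  # SE kernel: k(x, x) = variance = 1.0
    tr_Qff = noise_var * np.sum(A * A)
    return (-0.5 * N * np.log(2.0 * np.pi * noise_var)
            - np.sum(np.log(np.diag(LB)))
            - 0.5 * (y @ y) / noise_var
            + 0.5 * (c @ c)
            - 0.5 * (tr_Kff - tr_Qff) / noise_var)

# Illustrative use of the paper's rate for normally distributed inputs:
# grow M like log^D N as the dataset grows (constant chosen arbitrarily).
rng = np.random.default_rng(0)
D, N = 1, 2000
X = rng.normal(size=(N, D))
y = np.sin(3.0 * X[:, 0]) + 0.3 * rng.normal(size=N)
M = int(np.ceil(3 * np.log(N) ** D))
Z = X[rng.choice(N, size=M, replace=False)]  # inducing inputs from the data
print(M, sgpr_elbo(X, y, Z))
```

For N = 2000 and D = 1 the rule above selects only M ≈ 23 inducing points, illustrating the abstract's claim that M can grow far more slowly than N while the variational bound remains tight.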
Pages: 10