SPARSE CHOLESKY FACTORIZATION BY KULLBACK-LEIBLER MINIMIZATION

Cited by: 45
Authors
Schaefer, Florian [1 ]
Katzfuss, Matthias [2 ]
Owhadi, Houman [1 ]
Institutions
[1] CALTECH, Comp & Math Sci, Pasadena, CA 91125 USA
[2] Texas A&M Univ, Dept Stat, College Stn, TX 77843 USA
Funding
U.S. National Science Foundation;
Keywords
Cholesky factorization; screening effect; Vecchia approximation; factorized approximate inverse; Gaussian process regression; integral equation; fast multipole method; efficient; preconditioner; approximation; interpolation; models;
DOI
10.1137/20M1336254
Chinese Library Classification
O29 [Applied Mathematics];
Subject Classification Code
070104;
Abstract
We propose to compute a sparse approximate inverse Cholesky factor L of a dense covariance matrix Θ by minimizing the Kullback-Leibler divergence between the Gaussian distributions N(0, Θ) and N(0, L^{-⊤}L^{-1}), subject to a sparsity constraint. Surprisingly, this problem has a closed-form solution that can be computed efficiently, recovering the popular Vecchia approximation in spatial statistics. Based on recent results on the approximate sparsity of inverse Cholesky factors of Θ obtained from pairwise evaluation of Green's functions of elliptic boundary-value problems at points {x_i}_{1 ≤ i ≤ N} ⊂ R^d, we propose an elimination ordering and sparsity pattern that allows us to compute ε-approximate inverse Cholesky factors of such Θ in computational complexity O(N log^d(N/ε)) in space and O(N log^{2d}(N/ε)) in time. To the best of our knowledge, this is the best asymptotic complexity for this class of problems. Furthermore, our method is embarrassingly parallel, automatically exploits low-dimensional structure in the data, and can perform Gaussian-process regression in linear (in N) space complexity. Motivated by its optimality properties, we propose applying our method to the joint covariance of training and prediction points in Gaussian-process regression, greatly improving stability and computational cost. Finally, we show how to apply our method to the important setting of Gaussian processes with additive noise, compromising neither accuracy nor computational complexity.
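The closed-form solution mentioned in the abstract admits a compact columnwise implementation. The sketch below is an illustration under stated assumptions, not the authors' reference code: it assumes the column formula L_{s_i,i} = Θ_{s_i,s_i}^{-1} e_1 / sqrt(e_1^⊤ Θ_{s_i,s_i}^{-1} e_1), where s_i is the (given) sparsity set of column i with i as its smallest element. With the full lower-triangular pattern this reproduces the exact Cholesky factor of Θ^{-1}; restricting each s_i to a few nearby indices yields the sparse Vecchia-type approximation.

```python
import numpy as np

def kl_optimal_factor(Theta, pattern):
    """Columnwise closed-form KL-minimizing sparse inverse Cholesky factor.

    pattern[i] is the sparsity set s_i of column i: a sorted list of row
    indices whose first (smallest) element is i. Each column only requires
    solving a small dense system in Theta restricted to s_i.
    """
    N = Theta.shape[0]
    L = np.zeros((N, N))
    for i in range(N):
        s = np.asarray(pattern[i])
        e1 = np.zeros(len(s))
        e1[0] = 1.0
        # col = Theta_{s,s}^{-1} e_1 (first column of the small inverse)
        col = np.linalg.solve(Theta[np.ix_(s, s)], e1)
        # col[0] = e_1^T Theta_{s,s}^{-1} e_1 > 0 since Theta is SPD
        L[s, i] = col / np.sqrt(col[0])
    return L

# Sanity check: with the full lower-triangular pattern, L L^T = Theta^{-1}.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
Theta = A @ A.T + 5 * np.eye(5)                  # random SPD covariance
pattern = [list(range(i, 5)) for i in range(5)]  # dense lower-triangular sets
L = kl_optimal_factor(Theta, pattern)
```

Truncating each `pattern[i]` to the indices within a distance-based neighborhood of point i is what makes the factor sparse; the abstract's complexity bounds come from choosing that pattern (and the elimination ordering) from a hierarchical decomposition of the points.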
Pages: A2019 - A2046
Page count: 28