Rank Bounds for Approximating Gaussian Densities in the Tensor-Train Format

Cited: 6
Authors
Rohrbach, Paul B. [1 ]
Dolgov, Sergey [2 ]
Grasedyck, Lars [3 ]
Scheichl, Robert [4 ]
Affiliations
[1] Univ Cambridge, Dept Appl Math & Theoret Phys, Wilberforce Rd, Cambridge CB3 0WA, England
[2] Univ Bath, Dept Math Sci, Claverton Down, Bath BA2 7AY, Avon, England
[3] Rhein Westfal TH Aachen, Inst Geomet & Prakt Math, Templergraben 55, D-52056 Aachen, Germany
[4] Heidelberg Univ, Inst Appl Math, Neuenheimer Feld 205, D-69120 Heidelberg, Germany
Source
Funding
UK Engineering and Physical Sciences Research Council;
Keywords
tensor; Tensor-Train; high-dimensional; low rank; Gaussian probability distribution; DECOMPOSITION; ALGORITHM; TT;
DOI
10.1137/20M1314653
Chinese Library Classification
O1 [Mathematics];
Discipline Code
0701; 070101;
Abstract
Low-rank tensor approximations have shown great potential for uncertainty quantification in high dimensions, for example, to build surrogate models that can be used to speed up large-scale inference problems [M. Eigel, M. Marschall, and R. Schneider, Inverse Problems, 34 (2018), 035010; S. Dolgov et al., Stat. Comput., 30 (2020), pp. 603-625]. The feasibility and efficiency of such approaches depends critically on the rank that is necessary to represent or approximate the underlying distribution. In this paper, a priori rank bounds for approximations in the functional Tensor-Train representation for the case of Gaussian models are developed. It is shown that under suitable conditions on the precision matrix, the Gaussian density can be approximated to high accuracy without suffering from an exponential growth of complexity as the dimension increases. These results provide a rigorous justification of the suitability and the limitations of low-rank tensor methods in a simple but important model case. Numerical experiments confirm that the rank bounds capture the qualitative behavior of the rank structure when varying the parameters of the precision matrix and the accuracy of the approximation. Finally, the practical relevance of the theoretical results is demonstrated in the context of a Bayesian filtering problem.
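As a quick illustration of the setting (not the paper's construction), the sketch below discretizes a three-dimensional Gaussian density with a tridiagonal precision matrix on a tensor-product grid and compresses it with a plain TT-SVD (sequential truncated SVDs). The grid size, precision matrix entries, and truncation tolerance are illustrative choices; the point is only that the observed TT ranks stay well below the maximal possible rank.

```python
import numpy as np

# Illustrative sizes: 3 dimensions, 25 grid points per axis (not from the paper).
d, n = 3, 25
x = np.linspace(-3.0, 3.0, n)

# Tridiagonal precision matrix: a simple example of the "suitable conditions"
# (banded coupling) under which low TT ranks are expected.
P = (np.diag(2.0 * np.ones(d))
     + np.diag(-0.5 * np.ones(d - 1), 1)
     + np.diag(-0.5 * np.ones(d - 1), -1))

# Evaluate the unnormalized density pi(x) = exp(-x^T P x / 2) on the grid.
grid = np.stack(np.meshgrid(*([x] * d), indexing="ij"), axis=-1)  # (n, n, n, d)
density = np.exp(-0.5 * np.einsum("...i,ij,...j->...", grid, P, grid))

def tt_svd(tensor, tol=1e-6):
    """Compress a full tensor into TT cores via sequential truncated SVDs."""
    shape = tensor.shape
    cores, r = [], 1
    mat = tensor.reshape(r * shape[0], -1)
    for k in range(len(shape) - 1):
        U, s, Vt = np.linalg.svd(mat, full_matrices=False)
        rank = max(1, int(np.sum(s > tol * s[0])))  # relative truncation
        cores.append(U[:, :rank].reshape(r, shape[k], rank))
        r = rank
        mat = (s[:rank, None] * Vt[:rank]).reshape(r * shape[k + 1], -1)
    cores.append(mat.reshape(r, shape[-1], 1))
    return cores

cores = tt_svd(density)
ranks = [c.shape[2] for c in cores[:-1]]

# Reconstruct from the cores and measure the relative error.
full = cores[0]
for c in cores[1:]:
    full = np.tensordot(full, c, axes=([full.ndim - 1], [0]))
err = np.linalg.norm(full.reshape(density.shape) - density) / np.linalg.norm(density)
print("TT ranks:", ranks, "relative error:", err)
```

For this weakly coupled precision matrix the truncated ranks stay far below the trivial bound n, in line with the qualitative behavior the paper's bounds describe; strengthening the off-diagonal coupling or tightening the tolerance pushes the ranks up.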
Pages: 1191-1224 (34 pages)
Related Papers (50 in total)
  • [1] RANDOMIZED ALGORITHMS FOR ROUNDING IN THE TENSOR-TRAIN FORMAT
    Al Daas, Hussam
    Ballard, Grey
    Cazeaux, Paul
    Hallman, Eric
    Miedlar, Agnieszka
    Pasha, Mirjeta
    Reid, Tim W.
    Saibaba, Arvind K.
    SIAM JOURNAL ON SCIENTIFIC COMPUTING, 2023, 45 (01): A74-A95
  • [2] A Riemannian Rank-Adaptive Method for Higher-Order Tensor Completion in the Tensor-Train Format
    Vermeylen, Charlotte
    Van Barel, Marc
    NUMERICAL LINEAR ALGEBRA WITH APPLICATIONS, 2025, 32 (01)
  • [3] Provable Tensor-Train Format Tensor Completion by Riemannian Optimization
    Cai, Jian-Feng
    Li, Jingyang
    Xia, Dong
    JOURNAL OF MACHINE LEARNING RESEARCH, 2022, 23: 1-77
  • [4] Tensor-Train Kernel Learning for Gaussian Processes
    Kirstein, Max
    Sommer, David
    Eigel, Martin
    CONFORMAL AND PROBABILISTIC PREDICTION WITH APPLICATIONS, VOL 179, 2022, 179
  • [5] Robust Tensor Tracking With Missing Data Under Tensor-Train Format
    Le Trung Thanh
    Abed-Meraim, Karim
    Nguyen Linh Trung
    Hafiane, Adel
    2022 30TH EUROPEAN SIGNAL PROCESSING CONFERENCE (EUSIPCO 2022), 2022: 832-836
  • [6] Nearest-neighbor interaction systems in the tensor-train format
    Gelss, Patrick
    Klus, Stefan
    Matera, Sebastian
    Schuette, Christof
    JOURNAL OF COMPUTATIONAL PHYSICS, 2017, 341: 140-162
  • [7] Subspace Methods with Local Refinements for Eigenvalue Computation Using Low-Rank Tensor-Train Format
    Zhang, Junyu
    Wen, Zaiwen
    Zhang, Yin
    JOURNAL OF SCIENTIFIC COMPUTING, 2017, 70 (02): 478-499
  • [8] Challenging the Curse of Dimensionality in Multidimensional Numerical Integration by Using a Low-Rank Tensor-Train Format
    Alexandrov, Boian
    Manzini, Gianmarco
    Skau, Erik W.
    Truong, Phan Minh Duc
    Vuchov, Radoslav G.
    MATHEMATICS, 2023, 11 (03)