Preconditioned low-rank methods for high-dimensional elliptic PDE eigenvalue problems

Cited: 0
Authors
Kressner D. [1 ]
Tobler C. [2 ]
Affiliations
[1] Department of Numerical Algorithms and HPC, MATHICSE, EPF Lausanne
[2] Seminar for Applied Mathematics, ETH Zurich, CH-8092 Zürich
Keywords
ALS; DMRG; High-dimensional PDE eigenvalue problems; LOBPCG; Low-rank tensor methods
DOI
10.2478/cmam-2011-0020
Abstract
We consider elliptic PDE eigenvalue problems on a tensorized domain, discretized such that the resulting matrix eigenvalue problem Ax = λx exhibits Kronecker product structure. In particular, we are concerned with the case of high dimensions, where standard approaches to the solution of matrix eigenvalue problems fail due to the exponentially growing degrees of freedom. Recent work shows that this curse of dimensionality can in many cases be addressed by approximating the desired solution vector x in a low-rank tensor format. In this paper, we use the hierarchical Tucker decomposition to develop a low-rank variant of LOBPCG, a classical preconditioned eigenvalue solver. We also show how the ALS and MALS (DMRG) methods known from computational quantum physics can be adapted to the hierarchical Tucker decomposition. Finally, a combination of ALS and MALS with LOBPCG and with our low-rank variant is proposed. A number of numerical experiments indicate that such combinations represent the methods of choice. © 2011 Institute of Mathematics, National Academy of Sciences.
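The abstract's setting can be illustrated in small dimensions: a finite-difference discretization of the Laplacian on the unit square yields a matrix with exactly the Kronecker product structure A = L ⊗ I + I ⊗ L, to which a preconditioned LOBPCG iteration applies. The following is a minimal sketch using SciPy's standard full-vector LOBPCG, not the paper's low-rank tensor variant; the grid size, block size, and the sparse-LU preconditioner are illustrative choices (an exact solve is only feasible because the problem here is tiny).

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import LinearOperator, lobpcg, splu

# 1D Dirichlet Laplacian on n interior grid points of (0, 1)
n = 20
h = 1.0 / (n + 1)
L = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n)) / h**2
I = sp.identity(n)

# 2D discretization with Kronecker product structure: A = L (x) I + I (x) L
A = (sp.kron(L, I) + sp.kron(I, L)).tocsc()

# Preconditioner: exact solve via sparse LU -- a stand-in for the structured
# preconditioners a low-rank method would use; only viable at this small size
lu = splu(A)
M = LinearOperator(A.shape, matvec=lu.solve)

# LOBPCG for the 3 smallest eigenpairs, starting from a random block
rng = np.random.default_rng(0)
X = rng.standard_normal((n * n, 3))
vals, vecs = lobpcg(A, X, M=M, largest=False, tol=1e-8, maxiter=100)

# The smallest eigenvalue of -Laplace on the unit square is 2*pi^2 ~ 19.74;
# the discrete value lies slightly below it.
print(vals[0])
```

In high dimensions d, the analogous operator is a sum of d Kronecker products and the vector x has n^d entries, which is exactly where the full-vector iteration above breaks down and the paper's hierarchical Tucker representation of x takes over.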
Pages: 363-381
Page count: 18
Related papers
50 records in total
  • [1] On low-rank approximability of solutions to high-dimensional operator equations and eigenvalue problems
    Kressner, Daniel
    Uschmajew, Andre
    LINEAR ALGEBRA AND ITS APPLICATIONS, 2016, 493 : 556 - 572
  • [2] Projection Methods for Dynamical Low-Rank Approximation of High-Dimensional Problems
    Kieri, Emil
    Vandereycken, Bart
    COMPUTATIONAL METHODS IN APPLIED MATHEMATICS, 2019, 19 (01) : 73 - 92
  • [3] Low-Rank Bandit Methods for High-Dimensional Dynamic Pricing
    Mueller, Jonas
    Syrgkanis, Vasilis
    Taddy, Matt
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [4] LOW-RANK SOLUTION METHODS FOR STOCHASTIC EIGENVALUE PROBLEMS
    Elman, Howard C.
    Su, Tengfei
    SIAM JOURNAL ON SCIENTIFIC COMPUTING, 2019, 41 (04): : A2657 - A2680
  • [5] Reconstruction of a high-dimensional low-rank matrix
    Yata, Kazuyoshi
    Aoshima, Makoto
    ELECTRONIC JOURNAL OF STATISTICS, 2016, 10 (01): : 895 - 917
  • [6] ESTIMATION OF HIGH-DIMENSIONAL LOW-RANK MATRICES
    Rohde, Angelika
    Tsybakov, Alexandre B.
    ANNALS OF STATISTICS, 2011, 39 (02): : 887 - 930
  • [7] High-dimensional VAR with low-rank transition
    Alquier, Pierre
    Bertin, Karine
    Doukhan, Paul
    Garnier, Remy
    STATISTICS AND COMPUTING, 2020, 30 (04) : 1139 - 1153
  • [8] Low-rank Riemannian eigensolver for high-dimensional Hamiltonians
    Rakhuba, Maxim
    Novikov, Alexander
    Oseledets, Ivan
    JOURNAL OF COMPUTATIONAL PHYSICS, 2019, 396 : 718 - 737
  • [9] LOW-RANK TENSOR METHODS WITH SUBSPACE CORRECTION FOR SYMMETRIC EIGENVALUE PROBLEMS
    Kressner, Daniel
    Steinlechner, Michael
    Uschmajew, Andre
    SIAM JOURNAL ON SCIENTIFIC COMPUTING, 2014, 36 (05): : A2346 - A2368