On the convergence of Krylov methods with low-rank truncations

Cited: 0
Authors
Davide Palitta
Patrick Kürschner
Affiliations
[1] Max Planck Institute for Dynamics of Complex Technical Systems, Research Group Computational Methods in Systems and Control Theory (CSC)
[2] Leipzig University of Applied Sciences (HTWK Leipzig), Centre for Mathematics and Natural Sciences
Source
Numerical Algorithms | 2021, Vol. 88
Keywords
Linear matrix equations; Krylov subspace methods; Low-rank methods; Low-rank truncations; MSC: 65F10; 65F30; 15A06; 15A24
DOI
Not available
Abstract
Low-rank Krylov methods are among the few options available in the literature for the numerical solution of large-scale general linear matrix equations. These routines are well-known Krylov schemes equipped with low-rank truncations that keep the storage demand of the overall solution procedure feasible. However, such truncations may affect the convergence properties of the adopted Krylov method. In this paper we show how the truncation steps must be performed in order to preserve the convergence of the Krylov routine. Several numerical experiments validate our theoretical findings.
Pages: 1383–1417 (34 pages)
Related papers (50 total)
  • [21] The rate of convergence for sparse and low-rank quantile trace regression
    Tan, Xiangyong
    Peng, Ling
    Xiao, Peiwen
    Liu, Qing
    Liu, Xiaohui
    JOURNAL OF COMPLEXITY, 2023, 79
  • [22] Convergence of projected subgradient method with sparse or low-rank constraints
    Xu, Hang
    Li, Song
    Lin, Junhong
    ADVANCES IN COMPUTATIONAL MATHEMATICS, 2024, 50 (04)
  • [23] An Improved Frequent Directions Algorithm for Low-Rank Approximation via Block Krylov Iteration
    Wang, Chenhao
    Yi, Qianxin
    Liao, Xiuwu
    Wang, Yao
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (07) : 9428 - 9442
  • [24] Matrix-free Krylov iteration for implicit convolution of numerically low-rank data
    Breuer, Alex
    Lumsdaine, Andrew
    JOURNAL OF COMPUTATIONAL AND APPLIED MATHEMATICS, 2016, 308 : 98 - 116
  • [25] Low-Rank Matrix Recovery with Scaled Subgradient Methods: Fast and Robust Convergence Without the Condition Number
    Tong, Tian
    Ma, Cong
    Chi, Yuejie
    2021 IEEE DATA SCIENCE AND LEARNING WORKSHOP (DSLW), 2021
  • [26] Low-Rank Matrix Recovery With Scaled Subgradient Methods: Fast and Robust Convergence Without the Condition Number
    Tong, Tian
    Ma, Cong
    Chi, Yuejie
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2021, 69 : 2396 - 2409
  • [27] Low-rank tensor methods for partial differential equations
    Bachmayr, Markus
    ACTA NUMERICA, 2023, 32 : 1 - 121
  • [28] Augmented low-rank methods for gaussian process regression
    Thomas, Emil
    Sarin, Vivek
    APPLIED INTELLIGENCE, 2022, 52 (02) : 1254 - 1267
  • [29] Sparse Bayesian Methods for Low-Rank Matrix Estimation
    Babacan, S. Derin
    Luessi, Martin
    Molina, Rafael
    Katsaggelos, Aggelos K.
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2012, 60 (08) : 3964 - 3977
  • [30] Low-rank extragradient methods for scalable semidefinite optimization
    Garber, Dan
    Kaplan, Atara
    OPERATIONS RESEARCH LETTERS, 2025, 60