The performance of Krylov subspace eigenvalue algorithms for large matrices can be measured by the angle between a desired invariant subspace and the Krylov subspace. We develop general bounds on this convergence that account for the effects of polynomial restarting and impose no restrictions on the diagonalizability of the matrix or its degree of nonnormality. Associated with a desired set of eigenvalues is a maximum "reachable invariant subspace" that can be developed from the given starting vector. Convergence to this distinguished subspace is bounded in terms of a polynomial approximation problem. Elementary results from potential theory yield convergence rate estimates and suggest restarting strategies based on optimal approximation points (e.g., Leja or Chebyshev points); exact shifts are evaluated within this framework. Computational examples illustrate the utility of these results. Origins of superlinear convergence effects are also described.
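The following is a minimal numerical sketch, not the paper's algorithm or an implementation of its bounds, illustrating the quantities in the abstract: an Arnoldi-built Krylov subspace, explicit polynomial restarting with exact shifts (the unwanted Ritz values serve as roots of the restart polynomial), and the largest principal angle between the Krylov subspace and a wanted invariant subspace. The test matrix, subspace dimension, restart length, and number of cycles are arbitrary illustrative choices.

```python
import numpy as np

def arnoldi(A, v, m):
    """m-step Arnoldi factorization: A V_m = V_{m+1} H, V orthonormal, H upper Hessenberg."""
    n = v.size
    dt = np.result_type(A.dtype, v.dtype)
    V = np.zeros((n, m + 1), dtype=dt)
    H = np.zeros((m + 1, m), dtype=dt)
    V[:, 0] = v / np.linalg.norm(v)
    for j in range(m):
        w = A @ V[:, j]
        for i in range(j + 1):                  # modified Gram-Schmidt orthogonalization
            H[i, j] = np.vdot(V[:, i], w)
            w = w - H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-14:                 # breakdown: the Krylov space is invariant
            return V[:, :j + 1], H[:j + 2, :j + 1]
        V[:, j + 1] = w / H[j + 1, j]
    return V, H

def largest_angle(U, K):
    """Largest principal angle between span(U) and span(K), from the SVD of the cosines."""
    Qu = np.linalg.qr(U)[0]
    Qk = np.linalg.qr(K)[0]
    s = np.linalg.svd(Qk.conj().T @ Qu, compute_uv=False)
    return np.arccos(np.clip(s.min(), 0.0, 1.0))

rng = np.random.default_rng(0)
n, k, m, cycles = 200, 3, 20, 5
# Illustrative nonnormal test matrix: three well-separated rightmost eigenvalues,
# nonnormality supplied by the strictly upper-triangular perturbation.
d = np.concatenate([np.linspace(0.0, 1.0, n - k), [1.5, 1.6, 1.7]])
A = np.diag(d) + 0.01 * np.triu(rng.standard_normal((n, n)), 1)

# "Wanted" invariant subspace: eigenvectors belonging to the k rightmost eigenvalues.
evals, evecs = np.linalg.eig(A)
U = evecs[:, np.argsort(evals.real)[-k:]]

v = rng.standard_normal(n)
for cycle in range(cycles):
    V, H = arnoldi(A, v, m)
    steps = H.shape[1]                          # Arnoldi steps actually completed
    gap = largest_angle(U, V[:, :steps])
    print(f"cycle {cycle}: largest angle to wanted subspace = {gap:.3e}")
    # Exact shifts: the leftmost Ritz values become roots of the restart polynomial.
    ritz = np.linalg.eigvals(H[:steps, :steps])
    shifts = ritz[np.argsort(ritz.real)][:max(steps - k, 0)]
    v = V[:, 0]
    for mu in shifts:                           # apply the restart filter one linear factor at a time
        v = A @ v - mu * v
        v = v / np.linalg.norm(v)
```

In this sketch the angle to the wanted subspace typically decreases from cycle to cycle, since each restart filters the starting vector by a polynomial that is small on the unwanted Ritz values; production codes apply the same filtering implicitly rather than through explicit matrix-vector products.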