Quantum gradient descent for linear systems and least squares

Cited: 70
Authors
Kerenidis, Iordanis [1 ,2 ]
Prakash, Anupam [1 ]
Affiliations
[1] Univ Paris Diderot, CNRS, IRIF, Paris, France
[2] Natl Univ Singapore, Ctr Quantum Technol, Singapore, Singapore
Keywords
ALGORITHMS
DOI
10.1103/PhysRevA.101.022316
Chinese Library Classification (CLC)
O43 [Optics]
Discipline codes
070207; 0803
Abstract
Quantum machine learning and optimization are exciting new areas that have been brought forward by the breakthrough quantum algorithm of Harrow, Hassidim, and Lloyd for solving systems of linear equations. The utility of classical linear system solvers extends beyond linear algebra, as they can be leveraged to solve optimization problems using iterative methods like gradient descent. In this work, we provide a quantum method for performing gradient descent when the gradient is an affine function. Performing τ steps of gradient descent requires time O(τ·C_S) for weighted least-squares problems, where C_S is the cost of performing one step of the gradient descent quantumly, which at times can be considerably smaller than the classical cost. We illustrate our method with two applications: first, solving positive semidefinite linear systems, and, second, performing stochastic gradient descent for the weighted least-squares problem with reduced quantum memory requirements. We also provide a quantum linear system solver in the QRAM data structure model that provides significant savings in cost for large families of matrices.
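For weighted least squares with design matrix A, weights w, and targets b, the gradient of f(x) = (1/2)·Σ_i w_i·(a_i·x − b_i)² is the affine map x ↦ AᵀW(Ax − b) = Lx + c, with L = AᵀWA and c = −AᵀWb; this affine structure is what the paper's quantum method exploits. Below is a minimal classical sketch of the iteration itself, for orientation only: it is not the paper's quantum algorithm, and the names (A, b, w, eta, tau) are illustrative choices rather than the paper's notation.

```python
import numpy as np

def weighted_least_squares_gd(A, b, w, eta, tau):
    """Run tau gradient-descent steps on f(x) = 0.5 * sum_i w_i * (a_i.x - b_i)^2.

    Classically each step costs O(n*d); the paper's point is that the cost
    C_S of the corresponding quantum step can be considerably smaller.
    """
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(tau):
        # The gradient is affine in x: A^T W A x - A^T W b.
        grad = A.T @ (w * (A @ x - b))
        x = x - eta * grad
    return x

# Usage: recover a planted solution on random data.
rng = np.random.default_rng(0)
A = rng.normal(size=(100, 5))
x_true = rng.normal(size=5)
b = A @ x_true
w = rng.uniform(0.5, 1.5, size=100)
x_hat = weighted_least_squares_gd(A, b, w, eta=0.005, tau=5000)
print(np.linalg.norm(x_hat - x_true))  # small for this step size
```

For convergence the step size must satisfy eta < 2/λ_max(AᵀWA); eta = 0.005 is a safe choice for this random instance.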
Pages: 18