Quantum gradient descent for linear systems and least squares

Cited by: 81
Authors
Kerenidis, Iordanis [1 ,2 ]
Prakash, Anupam [1 ]
Affiliations
[1] Univ Paris Diderot, CNRS, IRIF, Paris, France
[2] Natl Univ Singapore, Ctr Quantum Technol, Singapore, Singapore
Keywords
ALGORITHMS
DOI
10.1103/PhysRevA.101.022316
Chinese Library Classification
O43 [Optics]
Subject Classification Codes
070207; 0803
Abstract
Quantum machine learning and optimization are exciting new areas that have been brought forward by the breakthrough quantum algorithm of Harrow, Hassidim, and Lloyd for solving systems of linear equations. The utility of classical linear system solvers extends beyond linear algebra, as they can be leveraged to solve optimization problems using iterative methods like gradient descent. In this work, we provide a quantum method for performing gradient descent when the gradient is an affine function. Performing $\tau$ steps of gradient descent requires time $O(\tau C_S)$ for weighted least-squares problems, where $C_S$ is the cost of performing one step of the gradient descent quantumly, which can at times be considerably smaller than the classical cost. We illustrate our method with two applications: first, solving positive semidefinite linear systems, and second, performing stochastic gradient descent for the weighted least-squares problem with reduced quantum memory requirements. We also provide a quantum linear system solver in the QRAM data structure model that offers significant cost savings for large families of matrices.
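For context, here is a minimal classical sketch of the iteration the abstract refers to: gradient descent for the weighted least-squares problem $\min_x \|W^{1/2}(Ax - b)\|^2$, whose gradient $2A^\top W(Ax - b) = Lx + c$ is affine in $x$; this affine structure is what the paper's quantum step exploits. The code below is classical and illustrative only, not an implementation of the quantum algorithm, and all names (`weighted_ls_gradient_descent`, `A`, `b`, `w`, `eta`, `tau`) are assumptions rather than notation from the paper.

```python
import numpy as np

def weighted_ls_gradient_descent(A, b, w, tau=500, eta=None):
    """Run tau gradient-descent steps for min_x ||W^(1/2)(A x - b)||^2.

    The gradient 2 A^T W (A x - b) is affine in x: it splits into a
    linear part L x and a constant part c, computed once up front.
    """
    L = 2.0 * A.T @ (w[:, None] * A)   # linear part: 2 A^T W A
    c = -2.0 * A.T @ (w * b)           # constant part: -2 A^T W b
    if eta is None:
        # Safe step size 1/lambda_max; gradient descent on a quadratic
        # converges for any eta < 2/lambda_max.
        eta = 1.0 / np.linalg.norm(L, 2)
    x = np.zeros(A.shape[1])
    for _ in range(tau):
        x = x - eta * (L @ x + c)      # one step, cost C_S per step classically
    return x

# Tiny usage example with synthetic data.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 5))
b = A @ rng.standard_normal(5) + 0.1 * rng.standard_normal(100)
w = rng.uniform(0.5, 2.0, size=100)   # per-row weights
x_hat = weighted_ls_gradient_descent(A, b, w)
```

The paper's contribution is to replace the per-step classical cost with a quantum cost $C_S$ that can be considerably smaller; the affine form of the update is what makes each step amenable to a quantum subroutine.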
Pages: 18