On the worst-case complexity of the gradient method with exact line search for smooth strongly convex functions

Cited: 40
Authors
de Klerk, Etienne [1 ,2 ]
Glineur, Francois [3 ,4 ]
Taylor, Adrien B. [3 ,4 ]
Affiliations
[1] Tilburg Univ, Tilburg, Netherlands
[2] Delft Univ Technol, Delft, Netherlands
[3] UCL, CORE, Louvain La Neuve, Belgium
[4] ICTEAM, Louvain La Neuve, Belgium
Keywords
Gradient method; Steepest descent; Semidefinite programming; Performance estimation problem;
DOI
10.1007/s11590-016-1087-4
Chinese Library Classification (CLC)
C93 [Management]; O22 [Operations Research];
Subject Classification Codes
070105 ; 12 ; 1201 ; 1202 ; 120202 ;
Abstract
We consider the gradient (or steepest) descent method with exact line search applied to a strongly convex function with Lipschitz continuous gradient. We establish the exact worst-case rate of convergence of this scheme, and show that this worst-case behavior is exhibited by a certain convex quadratic function. We also give the tight worst-case complexity bound for a noisy variant of the gradient descent method, where exact line search is performed in a search direction that differs from the negative gradient by at most a prescribed relative tolerance. The proofs are computer-assisted, and rely on the solution of semidefinite programming performance estimation problems as introduced in the paper (Drori and Teboulle, Math Progr 145(1-2):451-482, 2014).
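To illustrate the scheme analysed in the abstract, the following minimal sketch (not taken from the paper; the matrix A, the parameters mu and L, and the starting point are assumptions chosen for illustration) runs gradient descent with exact line search on a convex quadratic f(x) = 0.5 x'Ax, where the exact step length along the negative gradient has a closed form, and compares the per-step decrease of f against the classical bound ((kappa-1)/(kappa+1))^2 with kappa = L/mu.

```python
import numpy as np

# Minimal sketch (assumed example, not the paper's construction):
# gradient descent with exact line search on f(x) = 0.5 * x^T A x,
# with A symmetric positive definite. For quadratics the exact
# line-search step has the closed form
#   t_k = (g^T g) / (g^T A g),  where g = grad f(x_k) = A x_k.
mu, L = 1.0, 10.0                     # assumed strong convexity / smoothness constants
A = np.diag([mu, L])                  # simple 2D quadratic with condition number L/mu
x = np.array([1.0 / mu, 1.0 / L])     # assumed starting point (classical worst case for quadratics)

f = lambda z: 0.5 * z @ A @ z
kappa = L / mu
rate = ((kappa - 1.0) / (kappa + 1.0)) ** 2   # per-step bound on function values

for k in range(10):
    g = A @ x                         # gradient of the quadratic
    t = (g @ g) / (g @ (A @ g))       # exact line-search step length
    x_new = x - t * g
    print(f"k={k}: f-ratio = {f(x_new) / f(x):.6f}  (bound = {rate:.6f})")
    x = x_new
```

With this diagonal choice the zig-zagging iterates attain the bound at every step, which is the classical quadratic analysis; the paper's contribution, as stated in the abstract, is that the same rate is tight over the whole class of smooth strongly convex functions, not only quadratics.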
Pages: 1185-1199
Page count: 15
Related Papers
11 records in total
[1] [Anonymous], 1999, Athena Scientific, Belmont
[2] [Anonymous], 2004, Introductory Lectures on Convex Optimization
[3] Drori, Yoel; Teboulle, Marc. Performance of first-order methods for smooth convex minimization: a novel approach. Mathematical Programming, 2014, 145(1-2):451-482
[4] Luenberger D.G., 2016, International Series in Operations Research & Management Science, Vol. 228, p. 1, DOI 10.1007/978-3-319-18842-3
[5] Neelakantan A., 2015, arXiv:1511.06807v1
[6] Nemirovski A., 1999, Lecture Notes
[7] Nemirovski A., 1983, Wiley-Interscience Series
[8] Nocedal J., 2006, Springer Series in Operations Research, p. 1, DOI 10.1007/978-0-387-40065-5
[9] Polyak B.T., 1987, Introduction to Optimization
[10] Taylor A.B., 2015, arXiv:1512.07516