LINEAR CONVERGENCE OF DESCENT METHODS FOR THE UNCONSTRAINED MINIMIZATION OF RESTRICTED STRONGLY CONVEX FUNCTIONS

Cited by: 15
Author
Schoepfer, Frank [1]
Affiliation
[1] Carl von Ossietzky Univ Oldenburg, Inst Math, D-26111 Oldenburg, Germany
Keywords
linear convergence; restricted strong convexity; error bound; quadratic splines; BFGS; conjugate gradient; CONJUGATE-GRADIENT METHOD; QUASI-NEWTON METHODS; MEMORY BFGS METHOD; GLOBAL CONVERGENCE; ERROR-BOUNDS; QUADRATIC PROGRAMS; NONSMOOTH ANALYSIS; BREGMAN ITERATION; SINGULAR-VALUES; ALGORITHM;
DOI
10.1137/140992990
Chinese Library Classification (CLC)
O29 [Applied Mathematics];
Subject Classification Code
070104;
Abstract
Linear convergence rates of descent methods for unconstrained minimization are usually proved under the assumption that the objective function is strongly convex. Recently it was shown that the weaker assumption of restricted strong convexity suffices for linear convergence of the ordinary gradient descent method. A decisive difference from strong convexity is that the set of minimizers of a restricted strongly convex function need be neither a singleton nor bounded. In this paper we extend the linear convergence results under this weaker assumption to a larger class of descent methods, including restarted nonlinear CG, BFGS, and its damped limited memory variants L-D-BFGS. For twice continuously differentiable objective functions we even obtain r-step superlinear convergence for the CG_DESCENT conjugate gradient method of Hager and Zhang, where r is greater than or equal to the rank of the Hessian at a minimizer. This is remarkable since the Hessian of a restricted strongly convex function need not have full rank. Furthermore, we show that convex quadratic splines and the objective functions of the unconstrained duals to some linearly constrained optimization problems are restricted strongly convex. In particular, this holds for the regularized basis pursuit problem and its analogues for nuclear norm minimization and principal component pursuit.
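
The abstract's central point, that the ordinary gradient method still converges linearly when the minimizer set is an unbounded affine set rather than a single point, can be illustrated numerically. The following is a minimal sketch, not taken from the paper: it assumes a rank-deficient least-squares objective f(x) = 0.5*||Ax - b||^2, which is restricted strongly convex but not strongly convex, and runs plain gradient descent with the standard 1/L step size; all dimensions and variable names are illustrative choices.

```python
import numpy as np

# Hypothetical illustration (not from the paper): gradient descent on
# f(x) = 0.5 * ||A x - b||^2 where A is rank deficient.  This f is not
# strongly convex -- its minimizer set is an unbounded affine subspace --
# but it is restricted strongly convex, so the objective gap still
# decays at a linear (geometric) rate.

rng = np.random.default_rng(0)
m, n, r = 20, 30, 10                       # rank r < n, so A^T A is singular
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
b = rng.standard_normal(m)

x_star = np.linalg.pinv(A) @ b             # one particular minimizer
f_min = 0.5 * np.linalg.norm(A @ x_star - b) ** 2

L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of the gradient
x = np.zeros(n)
gaps = []
for _ in range(200):
    grad = A.T @ (A @ x - b)
    x = x - grad / L                       # plain gradient step, step size 1/L
    gaps.append(0.5 * np.linalg.norm(A @ x - b) ** 2 - f_min)

# Linear convergence shows up as roughly constant ratios of successive gaps.
ratios = [gaps[k + 1] / gaps[k] for k in range(20, 30)]
print("objective gap after 200 iterations:", gaps[-1])
print("successive gap ratios:", np.round(ratios, 4))
```

The roughly constant ratio of successive objective gaps is the numerical signature of a linear rate; a standard strong-convexity argument would not cover this example, since the Hessian A^T A is singular.
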
Pages: 1883-1911
Number of pages: 29