Cubic Regularization Methods with Second-Order Complexity Guarantee Based on a New Subproblem Reformulation

Cited by: 4
Authors
Jiang, Ru-Jun [1]
Zhou, Zhi-Shuo [1]
Zhou, Zi-Rui [2]
Affiliations
[1] Fudan Univ, Sch Data Sci, Shanghai 200433, Peoples R China
[2] Huawei Technol Canada, Burnaby, BC, Canada
Keywords
Cubic regularization subproblem; First-order methods; Constrained convex optimization; Complexity analysis; Trust-region subproblem; Nonconvex; Optimization; Algorithms
DOI
10.1007/s40305-022-00398-5
Chinese Library Classification (CLC)
C93 [Management]; O22 [Operations Research]
Subject Classification Codes
070105; 12; 1201; 1202; 120202
Abstract
The cubic regularization (CR) algorithm has attracted considerable attention in the literature in recent years. We propose a new reformulation of the cubic regularization subproblem. The reformulation is an unconstrained convex problem that requires computing the minimum eigenvalue of the Hessian. Based on this reformulation, we derive a variant of the (non-adaptive) CR method, assuming a known Lipschitz constant for the Hessian, and a variant of the adaptive regularization with cubics (ARC) method. We show that the iteration complexity of our variants matches the best-known bounds for unconstrained minimization algorithms that use first- and second-order information. Moreover, the operation complexity of both variants also matches the state-of-the-art bounds in the literature. Numerical experiments on test problems from the CUTEst collection show that the ARC variant based on our new subproblem reformulation is comparable to existing algorithms.
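For context, the subproblem referred to in the abstract is the standard cubic regularization model; the LaTeX sketch below shows only this generic form, with symbols g_k, H_k, and sigma_k introduced here for illustration (the paper's eigenvalue-based reformulation itself is not reproduced).

\min_{s \in \mathbb{R}^n} \; m_k(s) \;=\; g_k^{\top} s \;+\; \tfrac{1}{2}\, s^{\top} H_k\, s \;+\; \tfrac{\sigma_k}{3}\, \|s\|^3,
\qquad g_k = \nabla f(x_k), \quad H_k = \nabla^2 f(x_k), \quad \sigma_k > 0.

In the non-adaptive CR variant, sigma_k is held fixed at a value tied to the Hessian's Lipschitz constant; in ARC it is adjusted from iteration to iteration.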
Pages: 471-506
Number of pages: 36