A Homogeneous Second-Order Descent Method for Nonconvex Optimization

Cited by: 0
Authors
Zhang, Chuwen [1]
He, Chang [1]
Jiang, Yuntian [1]
Xue, Chenyu [1]
Jiang, Bo [1,2,3]
Ge, Dongdong [4]
Ye, Yinyu [5]
Affiliations
[1] Shanghai Univ Finance & Econ, Res Inst Interdisciplinary Sci, Sch Informat Management & Engn, Shanghai 200433, Peoples R China
[2] Shanghai Univ Finance & Econ, Key Lab Interdisciplinary Res Computat & Econ, Minist Educ, Shanghai 200433, Peoples R China
[3] Shanghai Univ Finance & Econ, Dishui Lake Adv Finance Inst, Shanghai 200120, Peoples R China
[4] Shanghai Jiao Tong Univ, Antai Coll Econ & Management, Shanghai 200030, Peoples R China
[5] Stanford Univ, Inst Computat & Math Engn, Palo Alto, CA 94305 USA
Funding
National Natural Science Foundation of China
Keywords
nonconvex optimization; nonlinear programming; global convergence; inexact algorithm; cubic regularization; Newton method; complexity; eigenvalue; subproblem; algorithms; CG
DOI
10.1287/moor.2023.0132
Chinese Library Classification
C93 [Management]; O22 [Operations Research]
Discipline Codes
070105; 12; 1201; 1202; 120202
Abstract
In this paper, we introduce a homogeneous second-order descent method (HSODM), motivated by the homogenization trick in quadratic programming. The merit of homogenization is that only the leftmost eigenvector of a gradient-Hessian integrated matrix needs to be computed at each iteration. The algorithm is therefore a single-loop method that does not need to switch to other sophisticated algorithms and is easy to implement. We show that HSODM has a global convergence rate of O(ε^{-3/2}) for finding an ε-approximate second-order stationary point, and a local quadratic convergence rate under standard assumptions. The numerical results demonstrate the advantage of the proposed method over other second-order methods.
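The abstract's central idea, extracting a search direction from the leftmost eigenvector of a gradient-Hessian augmented matrix, can be illustrated with a small sketch. This is a hypothetical simplification, not the paper's actual algorithm: the function name, the regularization parameter `delta`, and the sign-handling logic are assumptions for illustration only, and a dense eigendecomposition is used where a large-scale implementation would use an iterative leftmost-eigenpair solver.

```python
import numpy as np

def homogeneous_direction(grad, hess, delta=0.1):
    """Sketch of the homogenization idea: build the augmented matrix
    F = [[H, g], [g^T, -delta]] and take its leftmost eigenvector.
    The direction is recovered from the homogenizing coordinate t."""
    n = grad.size
    F = np.block([[hess, grad.reshape(-1, 1)],
                  [grad.reshape(1, -1), -delta * np.ones((1, 1))]])
    w, V = np.linalg.eigh(F)   # eigenvalues in ascending order
    v = V[:, 0]                # leftmost eigenvector of F
    t = v[-1]                  # homogenizing coordinate
    if abs(t) > 1e-10:
        d = v[:-1] / t         # de-homogenize to get the direction
    else:
        d = v[:-1]             # t ~ 0: negative-curvature direction
    if grad @ d > 0:           # orient the direction for descent
        d = -d
    return d
```

On a simple convex quadratic, for example `grad = [2, 3]` and `hess = diag(2, 3)`, the returned vector is a descent direction (`grad @ d < 0`), consistent with the single-loop structure the abstract describes.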
Pages: 32