A Homogeneous Second-Order Descent Method for Nonconvex Optimization

Cited by: 0
Authors
Zhang, Chuwen [1 ]
He, Chang [1 ]
Jiang, Yuntian [1 ]
Xue, Chenyu [1 ]
Jiang, Bo [1 ,2 ,3 ]
Ge, Dongdong [4 ]
Ye, Yinyu [5 ]
Affiliations
[1] Shanghai Univ Finance & Econ, Res Inst Interdisciplinary Sci, Sch Informat Management & Engn, Shanghai 200433, Peoples R China
[2] Shanghai Univ Finance & Econ, Key Lab Interdisciplinary Res Computat & Econ, Minist Educ, Shanghai 200433, Peoples R China
[3] Shanghai Univ Finance & Econ, Dishui Lake Adv Finance Inst, Shanghai 200120, Peoples R China
[4] Shanghai Jiao Tong Univ, Antai Coll Econ & Management, Shanghai 200030, Peoples R China
[5] Stanford Univ, Inst Computat & Math Engn, Palo Alto, CA 94305 USA
Funding
National Natural Science Foundation of China;
Keywords
nonconvex optimization; nonlinear programming; global convergence; inexact algorithm; CUBIC REGULARIZATION; NEWTON METHOD; COMPLEXITY; EIGENVALUE; SUBPROBLEM; ALGORITHMS; CG;
DOI
10.1287/moor.2023.0132
Chinese Library Classification
C93 [Management Science]; O22 [Operations Research];
Subject Classification Codes
070105 ; 12 ; 1201 ; 1202 ; 120202 ;
Abstract
In this paper, we introduce a homogeneous second-order descent method (HSODM), motivated by the homogenization trick in quadratic programming. The merit of homogenization is that only the leftmost eigenvector of a gradient-Hessian integrated matrix needs to be computed at each iteration. The algorithm is therefore a single-loop method that does not need to switch to other sophisticated algorithms and is easy to implement. We show that HSODM has a global convergence rate of O(epsilon^{-3/2}) for finding an epsilon-approximate second-order stationary point, and a local quadratic convergence rate under standard assumptions. The numerical results demonstrate the advantage of the proposed method over other second-order methods.
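The homogenization idea in the abstract can be sketched as follows: the gradient and Hessian are aggregated into a single (n+1)-by-(n+1) symmetric matrix, and one step amounts to an extreme (leftmost) eigenvector computation on that matrix. This is a minimal illustration based only on the abstract's description; the aggregated form [[H, g], [g, -delta]], the parameter delta, and the recovery rule d = v/t are common choices assumed here for illustration, not details quoted from the paper.

```python
import numpy as np

def hsodm_step(grad, hess, delta=0.01):
    """One homogenized step (sketch): build the gradient-Hessian
    integrated matrix and take its leftmost eigenvector.
    NOTE: the matrix form, delta, and d = v/t are illustrative
    assumptions, not the paper's exact specification."""
    n = grad.shape[0]
    # Aggregated (n+1) x (n+1) symmetric matrix.
    F = np.zeros((n + 1, n + 1))
    F[:n, :n] = hess
    F[:n, n] = grad
    F[n, :n] = grad
    F[n, n] = -delta
    # eigh returns eigenvalues in ascending order, so index 0
    # gives the leftmost eigenpair.
    vals, vecs = np.linalg.eigh(F)
    v, t = vecs[:n, 0], vecs[n, 0]
    # Recover a direction from the homogeneous coordinates (v, t).
    d = v / t if abs(t) > 1e-12 else v
    # Flip the sign if needed so that d is a descent direction.
    if grad @ d > 0:
        d = -d
    return d, vals[0]

# Toy indefinite example: f(x) = x0^2 - x1^2 at x = (1, -2).
x = np.array([1.0, -2.0])
g = np.array([2 * x[0], -2 * x[1]])   # gradient
H = np.diag([2.0, -2.0])              # indefinite Hessian
d, lam = hsodm_step(g, H)
print(d, lam)
```

Because only one extreme eigenpair is needed, a large-scale implementation would replace the dense `eigh` call with a matrix-free Lanczos-type solver; the dense version above is just the simplest way to exhibit the single-loop structure.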
Pages: 32