Yet another fast variant of Newton's method for nonconvex optimization

Cited: 1
Authors
Gratton, Serge [1 ]
Jerad, Sadok [2 ]
Toint, Philippe L. [3 ]
Affiliations
[1] Univ Toulouse, IRIT, INP, F-31000 Toulouse, France
[2] Univ Toulouse, IRIT, ANITI, INP, F-31000 Toulouse, France
[3] Univ Namur, NAXYS, B-5000 Namur, Belgium
Keywords
Newton's method; nonconvex optimization; negative curvature; adaptive regularization methods; evaluation complexity; EXPLOITING NEGATIVE CURVATURE; CASE EVALUATION COMPLEXITY; CUBIC REGULARIZATION; LINE-SEARCH; DIRECTIONS; ALGORITHMS; DESCENT;
DOI
10.1093/imanum/drae021
Chinese Library Classification
O29 [Applied Mathematics];
Discipline Code
070104;
Abstract
A class of second-order algorithms is proposed for minimizing smooth nonconvex functions that alternates between regularized Newton and negative curvature steps in an iteration-dependent subspace. In most cases, the Hessian matrix is regularized with the square root of the current gradient and an additional term taking moderate negative curvature into account, a negative curvature step being taken only exceptionally. Practical variants are detailed where the subspaces are chosen to be the full space, or Krylov subspaces. In the first case, the proposed method only requires the solution of a single linear system at nearly all iterations. We establish that at most $\mathcal{O}\big ( |\!\log \epsilon |\,\epsilon ^{-3/2}\big )$ evaluations of the problem's objective function and derivatives are needed for algorithms in the new class to obtain an $\epsilon $-approximate first-order minimizer, and at most $\mathcal{O}\big (|\!\log \epsilon |\,\epsilon ^{-3}\big )$ to obtain a second-order one. Encouraging initial numerical experiments with two full-space and two Krylov-subspace variants are finally presented.
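The abstract's core idea — shifting the Hessian by the square root of the gradient norm plus a term absorbing moderate negative curvature — can be illustrated with a minimal sketch. This is not the paper's algorithm (which works in iteration-dependent subspaces with carefully tuned constants and safeguards); the regularization choice below, the 1-D test function, and all constants are illustrative assumptions.

```python
import numpy as np

def grad(x):
    # f(x) = x^4/4 - x^2/2: a simple nonconvex function with minimizers at x = +-1
    return x**3 - x

def hess(x):
    # Hessian of f; negative for |x| < 1/sqrt(3), so curvature handling matters
    return np.array([[3.0 * x[0]**2 - 1.0]])

def regularized_newton_step(g, H):
    """One illustrative step: shift H by sqrt(||g||) plus a term
    offsetting negative curvature, then solve a single linear system."""
    sigma = np.sqrt(np.linalg.norm(g)) + max(0.0, -np.linalg.eigvalsh(H).min())
    return np.linalg.solve(H + sigma * np.eye(H.shape[0]), -g)

x = np.array([2.0])
for _ in range(30):
    x = x + regularized_newton_step(grad(x), hess(x))
print(x)  # approaches the minimizer x = 1
```

As the gradient vanishes, the shift sigma tends to zero and the step approaches a pure Newton step, which is the mechanism behind the fast local behavior the abstract alludes to; each iteration costs essentially one linear solve, matching the full-space variant's claim.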
Pages: 971 - 1008
Page count: 38
Related Papers
50 records in total
  • [1] Fast Newton Hard Thresholding Pursuit for Sparsity Constrained Nonconvex Optimization
    Chen, Jinghui
    Gu, Quanquan
    KDD'17: PROCEEDINGS OF THE 23RD ACM SIGKDD INTERNATIONAL CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, 2017, : 757 - 766
  • [2] ON THE COMPLEXITY OF STEEPEST DESCENT, NEWTON'S AND REGULARIZED NEWTON'S METHODS FOR NONCONVEX UNCONSTRAINED OPTIMIZATION PROBLEMS
    Cartis, C.
    Gould, N. I. M.
    Toint, Ph. L.
    SIAM JOURNAL ON OPTIMIZATION, 2010, 20 (06) : 2833 - 2852
  • [3] A Newton-CG based barrier-augmented Lagrangian method for general nonconvex conic optimization
    He, Chuan
    Huang, Heng
    Lu, Zhaosong
    COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2024, 89 (03) : 843 - 894
  • [4] STOCHASTIC QUASI-NEWTON METHOD FOR NONCONVEX STOCHASTIC OPTIMIZATION
    Wang, Xiao
    Ma, Shiqian
    Goldfarb, Donald
    Liu, Wei
    SIAM JOURNAL ON OPTIMIZATION, 2017, 27 (02) : 927 - 956
  • [5] A Stochastic Quasi-Newton Method for Large-Scale Nonconvex Optimization With Applications
    Chen, Huiming
    Wu, Ho-Chun
    Chan, Shing-Chow
    Lam, Wong-Hing
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2020, 31 (11) : 4776 - 4790
  • [6] A variant of Newton's irrational method
    Wang, Xiuhua
    Gu, Chuanqing
    Kou, Jisheng
    PROCEEDINGS OF THE THIRD INTERNATIONAL WORKSHOP ON MATRIX ANALYSIS AND APPLICATIONS, VOL 3, 2009, : 1 - 4
  • [7] A SUPERQUADRATIC VARIANT OF NEWTON'S METHOD
    Potra, Florian A.
    SIAM JOURNAL ON NUMERICAL ANALYSIS, 2017, 55 (06) : 2863 - 2884
  • [8] Convergence of Perry and Shanno's Memoryless Quasi-Newton Method for Nonconvex Optimization Problems
    Han, Ji-Ye
    Liu, Guang-Hui
    Yin, Hong-Xia
    OPERATIONS RESEARCH TRANSACTIONS (运筹学学报), 1997, (01) : 22 - 28
  • [9] On the local convergence of a stochastic semismooth Newton method for nonsmooth nonconvex optimization
    Milzarek, Andre
    Xiao, Xiantao
    Wen, Zaiwen
    Ulbrich, Michael
    SCIENCE CHINA-MATHEMATICS, 2022, 65 (10) : 2151 - 2170
  • [10] TRUST-REGION NEWTON-CG WITH STRONG SECOND-ORDER COMPLEXITY GUARANTEES FOR NONCONVEX OPTIMIZATION
    Curtis, Frank E.
    Robinson, Daniel P.
    Royer, Clement W.
    Wright, Stephen J.
    SIAM JOURNAL ON OPTIMIZATION, 2021, 31 (01) : 518 - 544