Yet another fast variant of Newton's method for nonconvex optimization

Cited by: 1
Authors
Gratton, Serge [1 ]
Jerad, Sadok [2 ]
Toint, Philippe L. [3 ]
Affiliations
[1] Univ Toulouse, IRIT, INP, F-31000 Toulouse, France
[2] Univ Toulouse, IRIT, ANITI, INP, F-31000 Toulouse, France
[3] Univ Namur, NAXYS, B-5000 Namur, Belgium
Keywords
Newton's method; nonconvex optimization; negative curvature; adaptive regularization methods; evaluation complexity; EXPLOITING NEGATIVE CURVATURE; CASE EVALUATION COMPLEXITY; CUBIC REGULARIZATION; LINE-SEARCH; DIRECTIONS; ALGORITHMS; DESCENT;
DOI
10.1093/imanum/drae021
CLC classification: O29 [Applied Mathematics]
Subject classification code: 070104
Abstract
A class of second-order algorithms is proposed for minimizing smooth nonconvex functions that alternates between regularized Newton and negative curvature steps in an iteration-dependent subspace. In most cases, the Hessian matrix is regularized with the square root of the current gradient and an additional term taking moderate negative curvature into account, a negative curvature step being taken only exceptionally. Practical variants are detailed where the subspaces are chosen to be the full space, or Krylov subspaces. In the first case, the proposed method only requires the solution of a single linear system at nearly all iterations. We establish that at most $\mathcal{O}\big(|\!\log \epsilon|\,\epsilon^{-3/2}\big)$ evaluations of the problem's objective function and derivatives are needed for algorithms in the new class to obtain an $\epsilon$-approximate first-order minimizer, and at most $\mathcal{O}\big(|\!\log \epsilon|\,\epsilon^{-3}\big)$ to obtain a second-order one. Encouraging initial numerical experiments with two full-space and two Krylov-subspace variants are finally presented.
Pages: 971-1008
Page count: 38