Yet another fast variant of Newton's method for nonconvex optimization

Cited by: 1
Authors
Gratton, Serge [1 ]
Jerad, Sadok [2 ]
Toint, Philippe L. [3 ]
Affiliations
[1] Univ Toulouse, IRIT, INP, F-31000 Toulouse, France
[2] Univ Toulouse, IRIT, ANITI, INP, F-31000 Toulouse, France
[3] Univ Namur, NAXYS, B-5000 Namur, Belgium
Keywords
Newton's method; nonconvex optimization; negative curvature; adaptive regularization methods; evaluation complexity; EXPLOITING NEGATIVE CURVATURE; CASE EVALUATION COMPLEXITY; CUBIC REGULARIZATION; LINE-SEARCH; DIRECTIONS; ALGORITHMS; DESCENT;
DOI
10.1093/imanum/drae021
Chinese Library Classification
O29 [Applied Mathematics];
Discipline code
070104;
Abstract
A class of second-order algorithms is proposed for minimizing smooth nonconvex functions that alternates between regularized Newton and negative curvature steps in an iteration-dependent subspace. In most cases, the Hessian matrix is regularized with the square root of the current gradient and an additional term taking moderate negative curvature into account; a negative curvature step is taken only exceptionally. Practical variants are detailed where the subspaces are chosen to be the full space or Krylov subspaces. In the first case, the proposed method only requires the solution of a single linear system at nearly all iterations. We establish that at most $\mathcal{O}\big(|\!\log \epsilon|\,\epsilon^{-3/2}\big)$ evaluations of the problem's objective function and derivatives are needed for algorithms in the new class to obtain an $\epsilon$-approximate first-order minimizer, and at most $\mathcal{O}\big(|\!\log \epsilon|\,\epsilon^{-3}\big)$ to obtain a second-order one. Encouraging initial numerical experiments with two full-space and two Krylov-subspace variants are presented.
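The regularization idea sketched in the abstract can be illustrated with a minimal full-space example: shift the Hessian by the square root of the current gradient norm, add an extra shift when moderate negative curvature is present, and solve a single linear system per iteration. This is only a simplified sketch under those assumptions, not the authors' actual algorithm (which works in iteration-dependent subspaces, uses adaptive parameters and occasional explicit negative-curvature steps); all function names and the test problem below are illustrative.

```python
import numpy as np

def regularized_newton_step(grad, hess):
    """One illustrative step: regularize the Hessian by sqrt(||g||) plus a
    shift absorbing any negative curvature, then solve one linear system."""
    sigma = np.sqrt(np.linalg.norm(grad))        # gradient-based regularization
    lam_min = np.linalg.eigvalsh(hess)[0]        # smallest Hessian eigenvalue
    shift = max(0.0, -lam_min)                   # extra term if curvature is negative
    H_reg = hess + (sigma + shift) * np.eye(len(grad))
    return -np.linalg.solve(H_reg, grad)         # single linear solve per iteration

# Illustrative nonconvex test problem: f(x, y) = x^4 + y^4 - 2x^2,
# which has minimizers at (+1, 0) and (-1, 0).
def f_grad(x):
    return np.array([4 * x[0]**3 - 4 * x[0], 4 * x[1]**3])

def f_hess(x):
    return np.diag([12 * x[0]**2 - 4, 12 * x[1]**2])

x = np.array([2.0, 1.0])
for _ in range(50):
    g = f_grad(x)
    if np.linalg.norm(g) < 1e-8:                 # approximate first-order point
        break
    x = x + regularized_newton_step(g, f_hess(x))
```

From the starting point (2, 1) the iteration drifts to the minimizer near (1, 0); note how the sqrt-of-gradient regularization vanishes as the gradient does, so the step approaches a pure Newton step near the solution.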
Pages: 971-1008
Number of pages: 38