A Second-Order Finite-Difference Method for Derivative-Free Optimization

Times Cited: 1
Authors
Chen, Qian [1 ]
Wang, Peng [1 ,2 ]
Zhu, Detong [3 ]
Affiliations
[1] Hainan Normal Univ, Math & Stat Coll, Haikou 570203, Hainan, Peoples R China
[2] Hainan Normal Univ, Key Lab, Minist Educ, Haikou 570203, Hainan, Peoples R China
[3] Shanghai Normal Univ, Math & Sci Coll, Shanghai 200234, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
GRADIENT; CONVEX
DOI
10.1155/2024/1947996
Chinese Library Classification
O1 [Mathematics]
Discipline Classification Code
0701; 070101
Abstract
In this paper, a second-order finite-difference method is proposed for finding a second-order stationary point of derivative-free nonconvex unconstrained optimization problems. The gradient and the Hessian matrix of the objective function are approximated by forward-difference or central-difference techniques. The algorithm works within the traditional trust-region framework, and the search direction is obtained by minimizing the approximate trust-region subproblem. Global convergence of the algorithm is established without the fully quadratic assumption. Numerical results demonstrate the effectiveness of the algorithm with both the forward-difference and the central-difference approximations.
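To make the approach described in the abstract concrete, the following minimal Python sketch (assuming NumPy) approximates the gradient by forward differences and the Hessian by central differences, then takes one trust-region step by minimizing the resulting quadratic subproblem. This is an illustration only, not the authors' algorithm: the helper names (fd_gradient, cd_hessian, solve_tr_subproblem, tr_step), the difference step sizes, and the eigenvalue-bisection subproblem solver are assumptions made for the sketch.

import numpy as np

def fd_gradient(f, x, h=1e-6):
    # Forward differences: g_i ~ (f(x + h*e_i) - f(x)) / h
    fx, n = f(x), x.size
    g = np.empty(n)
    for i in range(n):
        e = np.zeros(n); e[i] = h
        g[i] = (f(x + e) - fx) / h
    return g

def cd_hessian(f, x, h=1e-4):
    # Central differences:
    # H_ij ~ [f(x+h e_i+h e_j) - f(x+h e_i-h e_j)
    #         - f(x-h e_i+h e_j) + f(x-h e_i-h e_j)] / (4 h^2)
    n = x.size
    H = np.empty((n, n))
    for i in range(n):
        for j in range(i, n):
            ei = np.zeros(n); ei[i] = h
            ej = np.zeros(n); ej[j] = h
            H[i, j] = H[j, i] = (f(x + ei + ej) - f(x + ei - ej)
                                 - f(x - ei + ej) + f(x - ei - ej)) / (4 * h * h)
    return H

def solve_tr_subproblem(g, B, delta):
    # Minimize g^T p + 0.5 p^T B p subject to ||p|| <= delta, using the
    # spectral decomposition of B and bisection on the multiplier mu in
    # (B + mu I) p = -g.  (The so-called hard case is ignored for brevity.)
    lam, Q = np.linalg.eigh(B)
    gq = Q.T @ g
    p_of = lambda mu: -Q @ (gq / (lam + mu))
    mu_lo = max(0.0, -lam.min()) + 1e-12
    if np.linalg.norm(p_of(mu_lo)) <= delta:
        return p_of(mu_lo)                      # model minimizer is inside the region
    mu_hi = mu_lo + 1.0
    while np.linalg.norm(p_of(mu_hi)) > delta:  # bracket the boundary solution
        mu_hi *= 2.0
    for _ in range(100):                        # bisect on ||p(mu)|| = delta
        mu = 0.5 * (mu_lo + mu_hi)
        mu_lo, mu_hi = (mu, mu_hi) if np.linalg.norm(p_of(mu)) > delta else (mu_lo, mu)
    return p_of(mu_hi)

def tr_step(f, x, delta, eta=0.1):
    # One trust-region iteration built only from function values.
    g, B = fd_gradient(f, x), cd_hessian(f, x)
    p = solve_tr_subproblem(g, B, delta)
    pred = -(g @ p + 0.5 * p @ B @ p)           # predicted model reduction
    ared = f(x) - f(x + p)                      # actual reduction
    rho = ared / pred if pred > 0 else -1.0
    if rho >= eta:
        return x + p, 2.0 * delta               # accept the step, enlarge the radius
    return x, 0.5 * delta                       # reject the step, shrink the radius

# Example run on the Rosenbrock function (minimizer at (1, 1)).
rosen = lambda z: (1.0 - z[0])**2 + 100.0 * (z[1] - z[0]**2)**2
x, delta = np.array([-1.2, 1.0]), 1.0
for _ in range(60):
    x, delta = tr_step(rosen, x, delta)
print(x)   # close to [1. 1.]

Because the subproblem is minimized with the full second-order model, directions of negative curvature can be exploited, which is what distinguishes a method aiming at second-order stationary points from a purely first-order one.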
Pages: 12