A self-scaling memoryless BFGS based conjugate gradient method using multi-step secant condition for unconstrained minimization

Cited: 0
Authors
Kim, Yongjin [1 ]
Jong, Yunchol [1 ]
Kim, Yong [1 ]
Affiliations
[1] Univ Sci, Dept Math, Unjong Dist 355, Pyongyang 950003, North Korea
Keywords
unconstrained optimization; conjugate gradient method; multi-step secant condition; self-scaling; improved Wolfe line search; quasi-Newton methods; global convergence; descent; algorithm; property
DOI
10.21136/AM.2024.0204-23
Chinese Library Classification
O29 [Applied Mathematics];
Subject classification code
070104;
Abstract
Conjugate gradient methods are widely used for solving large-scale unconstrained optimization problems because they do not require the storage of matrices. Based on the self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno (SSML-BFGS) method, the conjugate gradient algorithms CG-DESCENT and CGOPT were proposed by W. Hager and H. Zhang (2005) and by Y. Dai and C. Kou (2013), respectively. These two conjugate gradient methods perform more efficiently than the SSML-BFGS method, and C. Kou and Y. Dai (2015) therefore proposed suitable modifications of the SSML-BFGS method so that the sufficient descent condition holds. To further improve the modified SSML-BFGS method, in this paper we present an efficient SSML-BFGS-type three-term conjugate gradient method for unconstrained minimization that uses the Ford-Moghrabi (multi-step) secant equation instead of the usual secant equation. The method is shown to be globally convergent under certain assumptions. Numerical results comparing it with methods based on the usual secant equations are reported.
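To make the setting concrete, the following is a minimal sketch of a nonlinear conjugate gradient iteration in the CG-DESCENT family that the abstract cites, using the Hager-Zhang choice of the parameter beta. This is not the paper's multi-step/Ford-Moghrabi method: the function name `cg_hz` is our own, the simple Armijo backtracking below stands in for the improved Wolfe line search the paper assumes, and the truncation of beta used in the full CG-DESCENT algorithm is omitted.

```python
# Sketch of a nonlinear conjugate gradient method with the
# Hager-Zhang beta (CG-DESCENT family). Assumptions: `cg_hz` is a
# hypothetical name, and Armijo backtracking replaces the improved
# Wolfe search used in the paper.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cg_hz(f, grad, x0, tol=1e-8, max_iter=1000):
    x = list(x0)
    g = grad(x)
    d = [-gi for gi in g]                      # initial steepest-descent direction
    for _ in range(max_iter):
        if max(abs(gi) for gi in g) < tol:
            break
        gd = dot(g, d)
        if gd >= 0:                            # safeguard: restart if not descent
            d = [-gi for gi in g]
            gd = dot(g, d)
        # Armijo backtracking line search (stand-in for the Wolfe search)
        alpha, fx = 1.0, f(x)
        while (f([xi + alpha * di for xi, di in zip(x, d)])
               > fx + 1e-4 * alpha * gd and alpha > 1e-12):
            alpha *= 0.5
        x_new = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = grad(x_new)
        y = [gn - gi for gn, gi in zip(g_new, g)]
        dy = dot(d, y)
        if abs(dy) < 1e-16:
            d = [-gi for gi in g_new]          # restart on breakdown
        else:
            # Hager-Zhang beta: (y - 2 d ||y||^2 / d'y)' g_new / (d'y)
            coef = 2.0 * dot(y, y) / dy
            w = [yi - coef * di for yi, di in zip(y, d)]
            beta = dot(w, g_new) / dy
            d = [-gn + beta * di for gn, di in zip(g_new, d)]
        x, g = x_new, g_new
    return x
```

On a well-conditioned quadratic such as f(x) = (x1 - 1)^2 + (x2 + 2)^2 the iteration recovers the minimizer (1, -2) in a couple of steps; the self-scaling memoryless BFGS viewpoint interprets such direction updates as products of a scaled rank-two quasi-Newton matrix with the gradient.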
Pages: 847-866
Number of pages: 20