A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization

Cited by: 0
Authors
Ming Li
Hongwei Liu
Zexian Liu
Affiliations
[1] Xidian University,College of Mathematics and Statistics
[2] Hezhou University,School of Mathematics and Computer Science
Source
Numerical Algorithms | 2018, Vol. 79
Keywords
Conjugate gradient method; Nonmonotone line search; Subspace minimization; Unconstrained optimization; Global convergence; 90C30; 90C06; 65K05;
DOI
Not available
Abstract
A new subspace minimization conjugate gradient algorithm with a nonmonotone Wolfe line search is proposed and analyzed. In the scheme, two choices of the search direction are obtained by minimizing a quadratic approximation of the objective function over special subspaces, and criteria are given for choosing between them. Under mild conditions, each choice of the direction is shown to satisfy the sufficient descent property. Based on a measure of how closely the objective function approximates a quadratic, a new strategy for choosing the initial stepsize of the line search is presented. With the nonmonotone Wolfe line search, global convergence of the proposed method is proved for general nonlinear functions under mild assumptions. Numerical comparisons with the well-known CGOPT and CG_DESCENT codes show that the proposed algorithm is very promising.
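To illustrate the kind of nonmonotone Wolfe line search the abstract refers to, here is a minimal generic sketch. It uses the Zhang–Hager-style nonmonotone reference value C_k, which is standard in this literature; the parameter values (delta, sigma, eta), the bracketing scheme, and the steepest-descent driver are illustrative assumptions, not the authors' exact algorithm or subspace directions.

```python
import numpy as np

def nonmonotone_wolfe(f, grad, x, d, C, alpha0=1.0,
                      delta=1e-4, sigma=0.9, max_iter=50):
    """Find alpha satisfying nonmonotone Wolfe conditions (sketch):
       f(x + alpha d) <= C + delta * alpha * g'd   (nonmonotone Armijo, C >= f(x))
       grad(x + alpha d)' d >= sigma * g'd          (curvature condition)
       via simple expansion/bisection bracketing."""
    g0d = grad(x) @ d          # directional derivative at x (negative for descent d)
    lo, hi = 0.0, np.inf
    alpha = alpha0
    for _ in range(max_iter):
        xa = x + alpha * d
        if f(xa) > C + delta * alpha * g0d:
            hi = alpha                          # Armijo fails: shrink the step
        elif grad(xa) @ d < sigma * g0d:
            lo = alpha                          # curvature fails: grow the step
        else:
            return alpha                        # both conditions hold
        alpha = 2.0 * alpha if np.isinf(hi) else 0.5 * (lo + hi)
    return alpha                                # fallback after max_iter

# Toy driver: steepest descent on a convex quadratic f(x) = 0.5 x'Ax,
# with the nonmonotone reference value C_k updated Zhang-Hager style
# (eta controls how much "memory" of past function values is kept).
A = np.diag([1.0, 10.0])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x

x = np.array([1.0, 1.0])
C, Q, eta = f(x), 1.0, 0.85
for k in range(30):
    d = -grad(x)
    alpha = nonmonotone_wolfe(f, grad, x, d, C)
    x = x + alpha * d
    Q_new = eta * Q + 1.0
    C = (eta * Q * C + f(x)) / Q_new            # convex combination update
    Q = Q_new

print(np.linalg.norm(grad(x)))                  # final gradient norm
```

Because C_k is at least the latest function value, the Armijo test tolerates occasional increases in f, which is the point of a nonmonotone search: it avoids rejecting good steps on ill-conditioned or nonconvex regions while still forcing overall decrease of the reference sequence.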
Pages: 195–219
Number of pages: 24