An accelerated subspace minimization three-term conjugate gradient algorithm for unconstrained optimization

Citations: 40
Author
Andrei, Neculai [1 ]
Affiliation
[1] Center for Advanced Modeling and Optimization, Research Institute for Informatics, Bucharest 1, Romania
Keywords
Unconstrained optimization; Three-term conjugate gradient; Descent condition; Conjugacy condition; Numerical comparisons; Global convergence; Ascent methods; Property; Storage
DOI
10.1007/s11075-013-9718-7
Chinese Library Classification
O29 [Applied Mathematics];
Discipline Classification Code
070104;
Abstract
A three-term conjugate gradient algorithm for large-scale unconstrained optimization using a subspace minimization technique is presented. In this algorithm the search directions are computed by minimizing the quadratic approximation of the objective function over the subspace spanned by the vectors -g_{k+1}, s_k and y_k. The search direction has the form d_{k+1} = -g_{k+1} + a_k s_k + b_k y_k, where the scalars a_k and b_k are determined by minimizing the quadratic approximation of the objective function over this subspace. The step lengths are determined by the Wolfe line search conditions. We prove that the search directions are descent directions and satisfy the Dai-Liao conjugacy condition; the suggested algorithm is thus a three-term conjugate gradient method for which both the descent and the conjugacy conditions are guaranteed. It is shown that, for uniformly convex functions, the directions generated by the algorithm are bounded above, i.e. the algorithm is convergent. Numerical experiments on a set of 750 unconstrained optimization test problems show that the new algorithm substantially outperforms the Hestenes-Stiefel, Dai-Liao, Dai-Yuan and Polak-Ribiere-Polyak conjugate gradient algorithms, as well as the limited-memory quasi-Newton method L-BFGS and the discrete truncated-Newton method TN.
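The direction described in the abstract can be sketched in a few lines. This is a minimal illustration, not the paper's exact update: where Andrei determines a_k and b_k from the true quadratic approximation, this sketch replaces the model Hessian with the Barzilai-Borwein scaled identity B = tau*I, tau = (y'y)/(s'y) (an assumption made here purely to keep the example self-contained), and then minimizes the model m(d) = g'd + (tau/2)||d||^2 over the affine set d = -g + a*s + b*y.

```python
import numpy as np

def three_term_direction(g, s, y):
    """Return d = -g + a*s + b*y minimizing a quadratic model of f.

    Sketch only (not the paper's formulas): the model Hessian is the
    Barzilai-Borwein scaled identity B = tau*I, tau = (y'y)/(s'y), so
    m(d) = g'd + (tau/2)*d'd is minimized over {-g + a*s + b*y}.
    """
    tau = (y @ y) / (s @ y)
    # Stationarity of m with respect to a and b gives a 2x2 linear system:
    #   tau * [[s's, s'y], [s'y, y'y]] @ [a, b] = (tau - 1) * [s'g, y'g]
    gram = tau * np.array([[s @ s, s @ y],
                           [s @ y, y @ y]])
    rhs = (tau - 1.0) * np.array([s @ g, y @ g])
    a, b = np.linalg.solve(gram, rhs)
    return -g + a * s + b * y
```

At the computed d the model gradient g + tau*d is orthogonal to both s and y, which is exactly the stationarity condition of the subspace minimization; when tau < 2 one also gets m(d) <= m(-g) < 0 and hence g'd < 0, i.e. a descent direction for this simplified model.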
Pages: 859-874
Page count: 16
References
38 in total
[1] Al-Bayati, A.Y., 2010, CAN J SCI ENG MATH, V1, P108.
[2] Andrei, N., 2008, Adv. Model. Optim., V10, P147.
[3] Andrei, N. An acceleration of gradient descent algorithm with backtracking for unconstrained optimization [J]. NUMERICAL ALGORITHMS, 2006, 42(01): 63-73.
[4] Andrei, N. A modified Polak-Ribiere-Polyak conjugate gradient algorithm for unconstrained optimization [J]. OPTIMIZATION, 2011, 60(12): 1457-1471.
[5] Andrei, N. Acceleration of conjugate gradient algorithms for unconstrained optimization [J]. APPLIED MATHEMATICS AND COMPUTATION, 2009, 213(02): 361-369.
[6] [Anonymous], 1987, PRACTICAL METHODS OP.
[7] Beale, E.M.L., 1972, Numerical Methods for Nonlinear Optimization, P39.
[8] Cheng, Wanyou. A two-term PRP-based descent method [J]. NUMERICAL FUNCTIONAL ANALYSIS AND OPTIMIZATION, 2007, 28(11-12): 1217-1230.
[9] Conn, A.R., 1996, LINEAR NONLINEAR CON, P50.
[10] Dai, Y.H.; Liao, L.Z. New conjugacy conditions and related nonlinear conjugate gradient methods [J]. APPLIED MATHEMATICS AND OPTIMIZATION, 2001, 43(01): 87-101.