Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization

Cited by: 59
Authors
Andrei, Neculai [1]
Affiliations
[1] Ctr Adv Modeling & Optimizat, Res Inst Informat, Bucharest 1, Romania
Keywords
unconstrained optimization; conjugate gradient method; spectral gradient method; Wolfe line search; BFGS preconditioning
DOI
10.1080/10556780600822260
Chinese Library Classification (CLC)
TP31 [Computer Software]
Discipline classification codes
081202; 0835
Abstract
A scaled memoryless BFGS preconditioned conjugate gradient algorithm for solving unconstrained optimization problems is presented. The basic idea is to combine the scaled memoryless BFGS method and the preconditioning technique in the frame of the conjugate gradient method. The preconditioner, which is also a scaled memoryless BFGS matrix, is reset when the Beale-Powell restart criterion holds. The parameter scaling the gradient is selected as the spectral gradient. Under very mild conditions it is shown that, for strongly convex functions, the algorithm is globally convergent. Computational results on a set of 750 unconstrained optimization test problems show that this new scaled conjugate gradient algorithm substantially outperforms the known conjugate gradient methods, including the spectral conjugate gradient method of Birgin and Martínez [Birgin, E. and Martínez, J.M., 2001, A spectral conjugate gradient method for unconstrained optimization. Applied Mathematics and Optimization, 43, 117-128], the conjugate gradient method of Polak and Ribière [Polak, E. and Ribière, G., 1969, Note sur la convergence de méthodes de directions conjuguées. Revue Française d'Informatique et de Recherche Opérationnelle, 16, 35-43], as well as the most recent conjugate gradient method with guaranteed descent of Hager and Zhang [Hager, W.W. and Zhang, H., 2005, A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM Journal on Optimization, 16, 170-192].
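For illustration, the Python sketch below computes the search direction of a generic scaled memoryless BFGS step with spectral scaling, using only the standard definitions s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k. It is a minimal sketch under those assumptions (the function name is ours), not the paper's exact SCALCG update, and it omits the Beale-Powell restart test and the Wolfe line search that the abstract describes.

import numpy as np

def scaled_memoryless_bfgs_direction(g, s, y):
    """Direction d = -H g, where H is the memoryless BFGS inverse-Hessian
    approximation built from the single pair (s, y) applied to theta * I,
    with spectral scaling theta = (s's)/(y's). Illustrative sketch only."""
    ys = y @ s                   # curvature y_k' s_k (positive under Wolfe conditions)
    theta = (s @ s) / ys         # spectral (Barzilai-Borwein-type) scaling parameter
    gs, gy = g @ s, g @ y
    coeff_s = theta * gy / ys - (1.0 + theta * (y @ y) / ys) * gs / ys
    return -theta * g + coeff_s * s + theta * (gs / ys) * y

# Toy usage on a quadratic f(x) = 0.5 x'Ax (hypothetical data, for illustration):
A = np.diag([1.0, 10.0])
x0, x1 = np.array([1.0, 1.0]), np.array([0.9, 0.1])
g0, g1 = A @ x0, A @ x1
d = scaled_memoryless_bfgs_direction(g1, x1 - x0, g1 - g0)

In a complete method of the kind described above, the first iteration (with no (s, y) pair yet) would fall back to the steepest-descent direction -g, the Beale-Powell criterion would decide when to reset the preconditioner, and the Wolfe line search would keep y's positive.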
Pages: 561-571
Number of pages: 11
Related papers
20 items in total
[1]   A spectral conjugate gradient method for unconstrained optimization [J].
Birgin, EG ;
Martínez, JM .
APPLIED MATHEMATICS AND OPTIMIZATION, 2001, 43 (02) :117-128
[2]   CUTE: Constrained and unconstrained testing environment [J].
Bongartz, I ;
Conn, AR ;
Gould, N ;
Toint, PL .
ACM TRANSACTIONS ON MATHEMATICAL SOFTWARE, 1995, 21 (01) :123-160
[3]   Convergence properties of Beale-Powell restart algorithm [J].
Dai, YH ;
Yuan, YX .
SCIENCE IN CHINA SERIES A-MATHEMATICS, 1998, 41 (11) :1142-1150
[4]   New conjugacy conditions and related nonlinear conjugate gradient methods [J].
Dai, YH ;
Liao, LZ .
APPLIED MATHEMATICS AND OPTIMIZATION, 2001, 43 (01) :87-101
[5]   Function minimization by conjugate gradients [J].
Fletcher, R ;
Reeves, CM .
COMPUTER JOURNAL, 1964, 7 (02) :149-154
[6]   Algorithm 851: CG DESCENT, a conjugate gradient method with guaranteed descent [J].
Hager, WW ;
Zhang, HC .
ACM TRANSACTIONS ON MATHEMATICAL SOFTWARE, 2006, 32 (01) :113-137
[7]   A new conjugate gradient method with guaranteed descent and an efficient line search [J].
Hager, WW ;
Zhang, HC .
SIAM JOURNAL ON OPTIMIZATION, 2005, 16 (01) :170-192
[8]   Methods of conjugate gradients for solving linear systems [J].
Hestenes, MR ;
Stiefel, E .
JOURNAL OF RESEARCH OF THE NATIONAL BUREAU OF STANDARDS, 1952, 49 (06) :409-436
[9]   Self-scaling variable metric (SSVM) algorithms, Part 1: Criteria and sufficient conditions for scaling a class of algorithms [J].
Oren, SS ;
Luenberger, DG .
MANAGEMENT SCIENCE SERIES A-THEORY, 1974, 20 (05) :845-862
[10]   Optimal conditioning of self-scaling variable metric algorithms [J].
Oren, SS ;
Spedicato, E .
MATHEMATICAL PROGRAMMING, 1976, 10 (01) :70-90