A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization

Cited by: 21
Authors
Babaie-Kafaki, Saman [1,2]
Affiliations
[1] Semnan Univ, Dept Math, Fac Math Stat & Comp Sci, Semnan, Iran
[2] Inst Res Fundamental Sci IPM, Sch Math, Tehran, Iran
Source
4OR - A QUARTERLY JOURNAL OF OPERATIONS RESEARCH | 2013, Vol. 11, No. 4
Keywords
Unconstrained optimization; Scaled conjugate gradient method; Modified secant equation; Sufficient descent condition; Global convergence; QUASI-NEWTON METHODS; ALGORITHM; CONVERGENCE; PERFORMANCE; EQUATION;
DOI
10.1007/s10288-013-0233-4
Chinese Library Classification (CLC)
C93 [Management Science]; O22 [Operations Research]
Subject classification codes
070105; 12; 1201; 1202; 120202
Abstract
In order to propose a scaled conjugate gradient method, the memoryless BFGS preconditioned conjugate gradient method suggested by Shanno and the spectral conjugate gradient method suggested by Birgin and Martínez are hybridized following Andrei's approach. Since the proposed method is designed based on a revised form of a modified secant equation suggested by Zhang et al., one of its interesting features is that it uses the available function values in addition to the gradient values. It is shown that, for uniformly convex objective functions, the search directions of the method fulfill the sufficient descent condition, which leads to global convergence. Numerical comparisons between implementations of the method and an efficient scaled conjugate gradient method proposed by Andrei, carried out on a set of unconstrained optimization test problems from the CUTEr collection, show the efficiency of the proposed modified scaled conjugate gradient method in the sense of the performance profile introduced by Dolan and Moré.
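The abstract names the building blocks of the method (a memoryless BFGS preconditioner applied to a spectrally scaled identity, a modified secant equation that also uses function values, and a sufficient descent safeguard) but gives no formulas. The Python sketch below is only a generic illustration of how such ingredients typically fit together, assuming the classical Shanno/Andrei memoryless BFGS search direction, a Barzilai-Borwein-type scaling, one common function-value-based modified secant correction from the literature, and a plain backtracking Armijo line search in place of a Wolfe search; it is not the algorithm of the paper, and all names (scaled_cg, modified_secant_vector, and so on) are illustrative.

```python
# Generic sketch of a scaled memoryless BFGS preconditioned CG iteration.
# NOT the paper's exact method; all formulas are standard textbook/literature
# forms used here only to illustrate the ingredients named in the abstract.
import numpy as np

def modified_secant_vector(s, y, f_old, f_new, g_old, g_new):
    """Replace y_k by z_k = y_k + (theta_k / ||s_k||^2) s_k with
    theta_k = 6(f_k - f_{k+1}) + 3(g_k + g_{k+1})^T s_k
    (a common function-value-based modified secant correction; assumed form)."""
    theta = 6.0 * (f_old - f_new) + 3.0 * np.dot(g_old + g_new, s)
    return y + (theta / np.dot(s, s)) * s

def scaled_memoryless_bfgs_direction(g, s, y):
    """d = -H g, where H is the memoryless BFGS update of tau * I and
    tau = s^T s / (s^T y) is a spectral (Barzilai-Borwein-type) scaling."""
    sy = np.dot(s, y)
    tau = np.dot(s, s) / sy
    gs, gy, yy = np.dot(g, s), np.dot(g, y), np.dot(y, y)
    return (-tau * g + tau * (gs / sy) * y
            - ((1.0 + tau * yy / sy) * (gs / sy) - tau * (gy / sy)) * s)

def armijo_step(f, x, d, fx, gx, alpha=1.0, rho=0.5, c1=1e-4, max_backtracks=60):
    """Simple backtracking line search (stand-in for the Wolfe search
    usually paired with conjugate gradient methods)."""
    for _ in range(max_backtracks):
        if f(x + alpha * d) <= fx + c1 * alpha * np.dot(gx, d):
            break
        alpha *= rho
    return alpha

def scaled_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    x = x0.astype(float)
    fx, g = f(x), grad(x)
    d = -g                                      # first step: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        alpha = armijo_step(f, x, d, fx, g)
        x_new = x + alpha * d
        f_new, g_new = f(x_new), grad(x_new)
        s, y = x_new - x, g_new - g
        z = modified_secant_vector(s, y, fx, f_new, g, g_new)
        if np.dot(s, z) > 1e-12:                # curvature-like safeguard
            d = scaled_memoryless_bfgs_direction(g_new, s, z)
        else:
            d = -g_new                          # restart with steepest descent
        x, fx, g = x_new, f_new, g_new
    return x

# Usage example: minimize the Rosenbrock function from a standard start point.
if __name__ == "__main__":
    rosen = lambda v: (1 - v[0]) ** 2 + 100 * (v[1] - v[0] ** 2) ** 2
    rosen_grad = lambda v: np.array([
        -2 * (1 - v[0]) - 400 * v[0] * (v[1] - v[0] ** 2),
        200 * (v[1] - v[0] ** 2),
    ])
    print(scaled_cg(rosen, rosen_grad, np.array([-1.2, 1.0])))
```

In this sketch the modified secant vector z_k replaces the usual gradient difference y_k inside the memoryless BFGS direction whenever the curvature-like test s_k^T z_k > 0 holds; otherwise the iteration restarts with the steepest descent direction.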
Pages: 361-374
Number of pages: 14
References
34 records in total
[1] Andrei, Neculai. A scaled nonlinear conjugate gradient algorithm for unconstrained optimization. OPTIMIZATION, 2008, 57(4): 549-570.
[2] Andrei, N. COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2007, 38: 401. DOI: 10.1007/s10589-007-9055-7.
[3] Andrei, Neculai. Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. OPTIMIZATION METHODS & SOFTWARE, 2007, 22(4): 561-571.
[4] Andrei, Neculai. A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. APPLIED MATHEMATICS LETTERS, 2007, 20(6): 645-650.
[5] Andrei, Neculai. Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. EUROPEAN JOURNAL OF OPERATIONAL RESEARCH, 2010, 204(3): 410-420.
[6] Babaie-Kafaki, Saman. A quadratic hybridization of Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods. JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS, 2012, 154(3): 916-932.
[7] Babaie-Kafaki, Saman. A note on the global convergence theorem of the scaled conjugate gradient algorithms proposed by Andrei. COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2012, 52(2): 409-414.
[8] Babaie-Kafaki, Saman. A modified BFGS algorithm based on a hybrid secant equation. SCIENCE CHINA-MATHEMATICS, 2011, 54(9): 2019-2036.
[9] Babaie-Kafaki, Saman; Ghanbari, Reza; Mahdavi-Amiri, Nezam. Two new conjugate gradient methods based on modified secant equations. JOURNAL OF COMPUTATIONAL AND APPLIED MATHEMATICS, 2010, 234(5): 1374-1386.
[10] Barzilai, J.; Borwein, J. M. Two-point step size gradient methods. IMA JOURNAL OF NUMERICAL ANALYSIS, 1988, 8(1): 141-148.