A double parameter scaled BFGS method for unconstrained optimization

Cited by: 20
Authors
Andrei, Neculai [1]
Affiliation
[1] Ctr Adv Modeling & Optimizat, Res Inst Informat, 8-10 Averescu Ave, Bucharest 1, Romania
Keywords
Unconstrained optimization; Scaled BFGS method; Self-correcting quality; Global convergence; Numerical comparisons; QUASI-NEWTON METHODS; SUPERLINEAR CONVERGENCE; GLOBAL CONVERGENCE; ASCENT METHODS; LINE SEARCHES; ALGORITHMS; MINIMIZATION; GRADIENTS
DOI
10.1016/j.cam.2017.10.009
Chinese Library Classification
O29 [Applied Mathematics]
Discipline Code
070104
Abstract
A double parameter scaled BFGS method for unconstrained optimization is presented. In this method, the first two terms of the well-known BFGS update formula are scaled with one positive parameter, while the third term is scaled with another positive parameter. These parameters are selected so as to improve the eigenvalue structure of the BFGS update. The parameter scaling the first two terms is determined by clustering the eigenvalues of the scaled BFGS matrix. The parameter scaling the third term is determined as a preconditioner to the Hessian of the minimizing function, combined with minimization of the conjugacy condition from conjugate gradient methods. Under the inexact Wolfe line search, global convergence of the double parameter scaled BFGS method is proved under very general conditions, without assuming convexity of the minimizing function. In preliminary numerical experiments on 80 unconstrained optimization test functions with a medium number of variables, the double parameter scaled BFGS method is more efficient than the standard BFGS update and than some other scaled BFGS methods. (C) 2017 Elsevier B.V. All rights reserved.
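The update structure described in the abstract can be sketched as follows. The classical BFGS formula is B_{k+1} = B_k − (B_k s_k s_k^T B_k)/(s_k^T B_k s_k) + (y_k y_k^T)/(y_k^T s_k); here the first two terms are multiplied by one scaling parameter and the rank-one third term by another. This is an illustrative sketch only: the names `delta` and `gamma` are placeholders, and the paper's actual rules for choosing them (eigenvalue clustering and the conjugacy condition) are not reproduced.

```python
import numpy as np

def double_scaled_bfgs_update(B, s, y, delta, gamma):
    """One double-parameter scaled BFGS update (structure per the abstract).

    The first two terms of the classical BFGS formula are scaled by
    `delta`, the rank-one correction term by `gamma`; setting
    delta = gamma = 1 recovers the standard BFGS update.  `s` is the
    step x_{k+1} - x_k and `y` the gradient difference g_{k+1} - g_k;
    y^T s > 0 is assumed (guaranteed by a Wolfe line search).
    """
    Bs = B @ s
    sBs = float(s @ Bs)      # curvature of B along s, positive if B is SPD
    ys = float(y @ s)        # secant curvature, positive under Wolfe conditions
    return delta * (B - np.outer(Bs, Bs) / sBs) + gamma * np.outer(y, y) / ys
```

Note that for any `delta`, the update satisfies a scaled secant condition B_{k+1} s = gamma * y, since the first two terms annihilate `s`; with `gamma = 1` this is the ordinary secant condition.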
Pages: 26-44 (19 pages)