An efficient modification of the Hestenes-Stiefel nonlinear conjugate gradient method with restart property

Cited by: 0
Authors
Zabidin Salleh
Ahmad Alhawarat
Affiliations
[1] Universiti Malaysia Terengganu, School of Informatics and Applied Mathematics
Source
Journal of Inequalities and Applications, 2016
Keywords
conjugate gradient method; Wolfe-Powell line search; Hestenes-Stiefel formula; restart condition; performance profile
DOI
Not available
Abstract
The conjugate gradient (CG) method is one of the most popular methods for solving nonlinear unconstrained optimization problems. The Hestenes-Stiefel (HS) formula is considered one of the most efficient CG formulas, and the HS coefficient satisfies the conjugacy condition regardless of the line search used. However, the HS parameter may fail to yield global convergence of the CG method under the Wolfe-Powell line search when the descent condition does not hold. In this paper, we use the original HS formula together with a mild condition to construct a CG method that restarts with the negative gradient. Convergence and descent properties are established under both the strong Wolfe-Powell (SWP) and weak Wolfe-Powell (WWP) line searches. The proposed condition guarantees that the HS coefficient is non-negative, that its value is bounded, and that restarts are not triggered too often. Numerical experiments with the SWP line search on standard optimization test problems demonstrate the robustness and efficiency of the new CG parameter in comparison with recent and classical CG formulas. An example illustrates how different initial points lead to different solutions for multimodal objective functions.
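The restart strategy described in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' exact algorithm: it uses the classical HS coefficient β = gₖ₊₁ᵀyₖ / (dₖᵀyₖ) with yₖ = gₖ₊₁ − gₖ, clamps β to be non-negative, restarts with the negative gradient whenever the descent condition is lost, and substitutes a simple Armijo backtracking line search for the paper's Wolfe-Powell search. The function name `cg_hs_restart` and all parameter defaults are illustrative choices.

```python
import numpy as np

def cg_hs_restart(f, grad, x0, tol=1e-6, max_iter=500):
    """Nonlinear CG with the Hestenes-Stiefel beta and a negative-gradient
    restart -- a simplified stand-in for the restart condition in the paper."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                 # start with steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search; the paper instead uses the
        # strong or weak Wolfe-Powell conditions.
        alpha, c1, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c1 * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                      # gradient difference y_k
        denom = d.dot(y)
        beta = g_new.dot(y) / denom if abs(denom) > 1e-12 else 0.0
        beta = max(beta, 0.0)              # keep the HS coefficient non-negative
        d = -g_new + beta * d
        if g_new.dot(d) >= 0.0:            # restart if descent is lost
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage: a small ill-conditioned quadratic whose minimizer is the origin.
quad = lambda x: x[0]**2 + 10.0 * x[1]**2
quad_grad = lambda x: np.array([2.0 * x[0], 20.0 * x[1]])
sol = cg_hs_restart(quad, quad_grad, [3.0, -2.0])
```

The clamp `beta = max(beta, 0.0)` together with the descent check mirrors, in simplified form, the paper's guarantee that the HS coefficient stays non-negative and that the search direction remains a descent direction.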