A Hybrid Conjugate Gradient Algorithm with Modified Secant Condition for Unconstrained Optimization as a Convex Combination of Hestenes-Stiefel and Dai-Yuan Algorithms

Cited by: 0
|
Authors
Andrei, Neculai [1 ,2 ]
Affiliations
[1] ICI Ctr Adv Modeling & Optimizat, Natl Inst R&D Informat, Bucharest 1, Romania
[2] Acad Romanian Scientists, Bucharest 5, Romania
Source
STUDIES IN INFORMATICS AND CONTROL | 2008, Vol. 17, No. 4
Keywords
Unconstrained optimization; hybrid conjugate gradient method; Newton direction; numerical comparisons;
DOI
Not available
CLC Number
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Another hybrid conjugate gradient algorithm is suggested in this paper. The parameter $\beta_k$ is computed as a convex combination of the $\beta_k^{HS}$ (Hestenes-Stiefel) and $\beta_k^{DY}$ (Dai-Yuan) formulae, i.e. $\beta_k^{C} = (1-\theta_k)\beta_k^{HS} + \theta_k\beta_k^{DY}$. The parameter $\theta_k$ in the convex combination is computed so that the direction of the conjugate gradient algorithm is the Newton direction and the pair $(s_k, y_k)$ satisfies the modified secant condition given by Zhang et al. [32] and Zhang and Xu [33], where $s_k = x_{k+1} - x_k$ and $y_k = g_{k+1} - g_k$. The algorithm uses the standard Wolfe line search conditions. Numerical comparisons with conjugate gradient algorithms show that this hybrid computational scheme outperforms a variant of the hybrid conjugate gradient algorithm given by Andrei [6], in which the pair $(s_k, y_k)$ satisfies the secant condition $\nabla^2 f(x_{k+1})\, s_k = y_k$, as well as the Hestenes-Stiefel and Dai-Yuan conjugate gradient algorithms and the hybrid conjugate gradient algorithms of Dai and Yuan. A set of 750 unconstrained optimization problems is used, some of them from the CUTE library.
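For illustration only (not reproduced from the paper), the following Python sketch shows how the convex-combination parameter $\beta_k^C$ could be assembled from the Hestenes-Stiefel and Dai-Yuan formulae once $\theta_k$ is available; the computation of $\theta_k$ from the Newton direction and the modified secant condition, as well as the Wolfe line search, are assumed to be supplied elsewhere.

```python
# Minimal sketch, not the paper's full algorithm: it only assembles
# beta_k^C = (1 - theta_k) * beta_k^HS + theta_k * beta_k^DY
# and the resulting search direction. theta_k is treated as a given input.
import numpy as np

def hybrid_beta(g_next, y, d, theta):
    """Convex combination of the Hestenes-Stiefel and Dai-Yuan formulae.

    g_next : gradient g_{k+1} at the new point x_{k+1}
    y      : y_k = g_{k+1} - g_k
    d      : current search direction d_k
    theta  : convex-combination parameter theta_k
    """
    dTy = d @ y
    beta_hs = (g_next @ y) / dTy           # Hestenes-Stiefel formula
    beta_dy = (g_next @ g_next) / dTy      # Dai-Yuan formula
    theta = min(max(theta, 0.0), 1.0)      # keep the combination convex
    return (1.0 - theta) * beta_hs + theta * beta_dy

def next_direction(g_next, y, d, theta):
    """New conjugate gradient direction d_{k+1} = -g_{k+1} + beta_k^C d_k."""
    return -g_next + hybrid_beta(g_next, y, d, theta) * d
```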
Pages: 373-392
Page count: 20