GUARANTEED DESCENT CONJUGATE GRADIENT METHODS WITH MODIFIED SECANT CONDITION

Cited by: 0
Authors
Li, Shishun [1 ]
Huang, Zhengda [1 ]
Affiliations
[1] Zhejiang Univ, Dept Math, Hangzhou 310027, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Unconstrained optimization; conjugate gradient method; global convergence; line search; standard Wolfe conditions;
DOI
Not available
Chinese Library Classification
T [Industrial Technology];
Subject Classification Code
08;
Abstract
Conjugate gradient methods are widely used to solve large-scale unconstrained optimization problems. Recently, Hager and Zhang (2006) proposed two guaranteed descent conjugate gradient methods. In this paper, following Hager and Zhang (2006), we use the modified secant condition of Zhang et al. (1999) to present two new descent conjugate gradient methods. An interesting feature of these new methods is that they exploit both gradient and function value information. Under suitable assumptions, global convergence is established for both methods. Numerical comparisons with the Hager-Zhang methods are reported.
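To illustrate the kind of method the abstract describes, here is a minimal NumPy sketch of a nonlinear conjugate gradient iteration with a Hager-Zhang-style direction update and a function-value-based modified secant vector in the spirit of Zhang et al. (1999). This is not the authors' implementation: the paper uses the standard Wolfe conditions, whereas this sketch substitutes simple Armijo backtracking, and all names, tolerances, and safeguards here are illustrative assumptions.

```python
import numpy as np

def modified_secant_y(y, s, f0, f1, g0, g1):
    # Modified secant vector using function values (Zhang et al., 1999);
    # u = s is an illustrative choice. On quadratics theta = 0, so yh = y.
    theta = 6.0 * (f0 - f1) + 3.0 * (g0 + g1) @ s
    return y + (theta / (s @ s)) * s

def cg_descent_sketch(f, grad, x0, iters=200, tol=1e-8):
    # Hager-Zhang-style beta with the modified secant vector.
    # Armijo backtracking stands in for the paper's Wolfe line search.
    x = x0.astype(float)
    g, d, fx = grad(x), -grad(x), f(x)
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        alpha, c1 = 1.0, 1e-4                      # backtracking Armijo
        while f(x + alpha * d) > fx + c1 * alpha * (g @ d):
            alpha *= 0.5
            if alpha < 1e-16:
                break
        x_new = x + alpha * d
        g_new, f_new = grad(x_new), f(x_new)
        s, y = x_new - x, g_new - g
        yh = modified_secant_y(y, s, fx, f_new, g, g_new)
        dy = d @ yh
        if abs(dy) < 1e-12:
            beta = 0.0
        else:
            beta = ((yh - 2.0 * d * (yh @ yh) / dy) @ g_new) / dy
        # Hager-Zhang truncation, which underlies the descent guarantee
        eta = -1.0 / (np.linalg.norm(d) * min(0.01, np.linalg.norm(g)))
        beta = max(beta, eta)
        d = -g_new + beta * d
        x, g, fx = x_new, g_new, f_new
    return x, fx

# Usage on a small convex quadratic f(x) = 0.5 x'Ax - b'x:
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
fq = lambda x: 0.5 * x @ A @ x - b @ x
gq = lambda x: A @ x - b
xstar, fstar = cg_descent_sketch(fq, gq, np.zeros(2))
```

On the quadratic example `xstar` approaches the solution of `A x = b`; on quadratics the modified secant vector coincides with the standard one, so the sketch reduces to a plain Hager-Zhang-style iteration there.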
Pages: 739-755
Page count: 17