Global Convergence Properties of Nonlinear Conjugate Gradient Methods with Modified Secant Condition

Authors
Hiroshi Yabe
Masahiro Takano
Affiliations
[1] Tokyo University of Science, Department of Mathematical Information Science
[2] National Statistics Center
Source
Computational Optimization and Applications | 2004, Vol. 28
Keywords
unconstrained optimization; conjugate gradient method; line search; global convergence; modified secant condition
Abstract
Conjugate gradient methods are appealing for large-scale nonlinear optimization problems. Recently, aiming at fast convergence, Dai and Liao (2001) used the secant condition of quasi-Newton methods. In this paper, we make use of the modified secant condition given by Zhang et al. (1999) and Zhang and Xu (2001) and, following Dai and Liao (2001), propose a new conjugate gradient method. A new feature of this method is that it exploits both the available gradient and function value information, thereby achieving high-order accuracy in approximating the second-order curvature of the objective function. The method is shown to be globally convergent under some assumptions. Numerical results are reported.
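The idea described in the abstract can be illustrated with a minimal sketch. This is not the paper's exact algorithm: it shows a Dai–Liao-type conjugate gradient step in which the usual gradient difference y_k is replaced by a modified vector z_k that incorporates function values, in the spirit of the modified secant condition of Zhang et al. The choice u = s_k, the Armijo backtracking line search, and the parameter t are illustrative assumptions; the paper itself analyzes Wolfe-type line searches.

```python
import numpy as np

def cg_modified_secant(f, grad, x0, t=0.1, tol=1e-8, max_iter=5000):
    """Sketch of a Dai-Liao-style CG method with a function-value-based
    modification of y_k (illustrative assumptions, not the paper's exact scheme)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Simple Armijo backtracking (placeholder for a Wolfe-type line search).
        fx, alpha = f(x), 1.0
        while f(x + alpha * d) > fx + 1e-4 * alpha * g.dot(d):
            alpha *= 0.5
            if alpha < 1e-12:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        # Modified secant term built from function values; it vanishes on
        # exact quadratics, where the update reduces to the Dai-Liao form.
        theta = 6.0 * (fx - f(x_new)) + 3.0 * (g + g_new).dot(s)
        z = y + (theta / s.dot(s)) * s  # u = s chosen for illustration
        denom = d.dot(z)
        beta = g_new.dot(z - t * s) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d
        if g_new.dot(d) >= 0.0:  # safeguard: restart with steepest descent
            d = -g_new
        x, g = x_new, g_new
    return x

# Convex quadratic test problem: f(x) = 0.5 x^T A x - b^T x, minimizer A^{-1} b.
A = np.diag([1.0, 10.0, 100.0])
b = np.ones(3)
f = lambda x: 0.5 * x.dot(A.dot(x)) - b.dot(x)
grad = lambda x: A.dot(x) - b
x_star = cg_modified_secant(f, grad, np.zeros(3))
```

On a quadratic the correction term theta is exactly zero, which is consistent with the abstract's point: the extra function-value information only matters when third-order (and higher) terms of the objective are non-negligible.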
Pages: 203–225