The Global Convergence Properties of a Conjugate Gradient Method

Cited by: 2
Authors
Omer, Osman [1 ]
Mamat, Mustafa [2 ]
Abashar, Abdelrhaman [1 ]
Rivaie, Mohd [3 ]
Affiliations
[1] Univ Malaysia Terengganu, Fac Sci & Technol, Dept Math, Terengganu, Malaysia
[2] Univ Sultan Zainal Abidin, Fac Informat & Comp, Kuala Terengganu, Malaysia
[3] Univ Teknol MARA UiTM Terengganu, Dept Comp Sci & Math, Terengganu, Malaysia
Source
PROCEEDINGS OF THE 3RD INTERNATIONAL CONFERENCE ON MATHEMATICAL SCIENCES | 2014 / Vol. 1602
Keywords
Conjugate gradient method; The strong Wolfe line search; sufficient descent property; global convergence; ASCENT METHODS;
DOI
10.1063/1.4882501
Chinese Library Classification
O29 [Applied Mathematics]
Discipline code
070104
Abstract
Conjugate gradient methods are among the most widely used methods for solving nonlinear unconstrained optimization problems, especially large-scale problems, owing to their simplicity and low memory requirements. The strong Wolfe line search is usually used in practice for the analysis and implementation of conjugate gradient methods. In this paper, we present a new nonlinear conjugate gradient method with the strong Wolfe line search for unconstrained optimization problems. Under some assumptions, the sufficient descent property and global convergence are established. Numerical results show that the new method is efficient for some unconstrained optimization problems.
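The following is a minimal sketch of the general framework the abstract describes: a nonlinear conjugate gradient iteration paired with a strong Wolfe line search. The abstract does not state the paper's new update parameter, so the classical Fletcher-Reeves beta is used here purely as a placeholder, and scipy.optimize.line_search supplies a step length satisfying the strong Wolfe conditions; the function name cg_fletcher_reeves and all parameter values are illustrative assumptions, not the authors' method.

```python
# Sketch of nonlinear conjugate gradient with a strong Wolfe line search.
# Uses the Fletcher-Reeves beta as a stand-in; the paper's new beta formula
# is not given in the abstract.
import numpy as np
from scipy.optimize import line_search  # step length satisfying strong Wolfe conditions


def cg_fletcher_reeves(f, grad, x0, tol=1e-6, max_iter=1000):
    """Generic nonlinear CG: x_{k+1} = x_k + alpha_k d_k, d_k = -g_k + beta_k d_{k-1}."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                              # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:     # gradient small enough: stop
            break
        # strong Wolfe step length (c1, c2 chosen so 0 < c1 < c2 < 1/2 for FR descent)
        alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0]
        if alpha is None:               # line search failed: restart with steepest descent
            d = -g
            alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0] or 1e-8
        x = x + alpha * d
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves beta (illustrative choice)
        d = -g_new + beta * d
        g = g_new
    return x


# Example: minimize the Rosenbrock function from a standard starting point.
if __name__ == "__main__":
    from scipy.optimize import rosen, rosen_der
    print(cg_fletcher_reeves(rosen, rosen_der, np.array([-1.2, 1.0])))
```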
Pages: 286-295
Number of pages: 10