Two modified nonlinear conjugate gradient methods with disturbance factors for unconstrained optimization

Cited by: 15
Authors
Jiang, Xian-Zhen [1 ]
Jian, Jin-Bao [1 ,2 ]
Affiliations
[1] Yulin Normal Univ, Sch Math & Informat Sci, Yulin 537000, Guangxi, Peoples R China
[2] Guangxi Univ, Coll Math & Informat Sci, Nanning 530004, Guangxi, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Unconstrained optimization; Conjugate gradient method; Disturbance factor; Global convergence; Numerical experiments; GLOBAL CONVERGENCE; PROPERTY; DESCENT;
DOI
10.1007/s11071-014-1303-7
Chinese Library Classification (CLC)
TH [Machinery and Instrument Industry];
Discipline classification code
0802;
Abstract
The nonlinear conjugate gradient method (CGM) is a very effective iterative method for solving large-scale optimization problems. In this paper, based on a variant of the Polak-Ribière-Polyak method, two modified CGMs with disturbance factors are proposed. Owing to the disturbance factors, the two proposed methods not only generate a sufficient descent direction at each iteration but also converge globally for nonconvex minimization when the strong Wolfe line search is used. Finally, preliminary numerical results are reported, which show that the proposed methods are promising.
Pages: 387-397
Page count: 11