A hybridization of the Hestenes-Stiefel and Dai-Yuan conjugate gradient methods based on a least-squares approach

Cited: 19
Authors
Babaie-Kafaki, Saman [1 ]
Ghanbari, Reza [2 ]
Affiliations
[1] Semnan Univ, Fac Math Stat & Comp Sci, Dept Math, Semnan, Iran
[2] Ferdowsi Univ Mashhad, Fac Math Sci, Mashhad, Iran
Source
OPTIMIZATION METHODS & SOFTWARE | 2015, Vol. 30, No. 4
Keywords
unconstrained optimization; conjugate gradient method; least-squares; global convergence; sufficient descent property
DOI
10.1080/10556788.2014.966825
Chinese Library Classification
TP31 (Computer Software)
Discipline codes
081202; 0835
Abstract
Following Andrei's approach of convexly combining conjugate gradient parameters, a hybridization of the Hestenes-Stiefel (HS) and Dai-Yuan (DY) conjugate gradient (CG) methods is proposed. The hybridization parameter is computed by solving the least-squares problem of minimizing the distance between the search direction of the hybrid method and that of the three-term conjugate gradient method proposed by Zhang et al., which possesses the sufficient descent property. Powell's non-negative restriction of the HS CG parameter is also employed in the hybrid method. A brief global convergence analysis is carried out without a convexity assumption on the objective function. Comparative numerical results are reported, demonstrating the efficiency of the proposed hybrid CG method in the sense of the Dolan-Moré performance profile.
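The construction described in the abstract can be sketched as follows. This is an illustrative reading, not the paper's exact formulas: the target direction below is the standard three-term HS direction of Zhang et al. (which satisfies d3ᵀg = -‖g‖² by construction), and the closed-form choice of the hybridization parameter θ comes from the one-dimensional least-squares fit the abstract describes; the clipping of θ to [0, 1] is an assumption to keep the combination convex.

```python
import numpy as np

def hybrid_hs_dy_direction(g, g_prev, d_prev):
    """Sketch of a hybrid HS/DY search direction (illustrative, not the
    paper's exact formulas).

    beta = (1 - theta) * beta_HS+ + theta * beta_DY, where theta is the
    1-D least-squares minimizer of the distance between the hybrid
    direction and the three-term direction of Zhang et al.
    """
    y = g - g_prev                        # gradient difference
    dy = d_prev @ y
    beta_hs = (g @ y) / dy                # Hestenes-Stiefel parameter
    beta_dy = (g @ g) / dy                # Dai-Yuan parameter

    # Three-term direction of Zhang et al.: satisfies d3 @ g = -||g||^2
    # exactly, independent of the line search.
    theta_z = (g @ d_prev) / dy
    d3 = -g + beta_hs * d_prev - theta_z * y

    # Powell's non-negative restriction on the HS parameter (used in the
    # hybrid combination).
    beta_hs_plus = max(beta_hs, 0.0)

    # Hybrid direction d(theta) = -g + ((1-theta)*beta_HS+ + theta*beta_DY) * d_prev.
    # Minimizing ||d(theta) - d3||^2 is linear least squares in theta:
    # with a = (beta_DY - beta_HS+) * d_prev and b = d(0) - d3,
    # the minimizer is theta = -(a @ b) / (a @ a).
    a = (beta_dy - beta_hs_plus) * d_prev
    b = -g + beta_hs_plus * d_prev - d3
    theta = 0.0 if a @ a == 0.0 else -(a @ b) / (a @ a)
    theta = min(max(theta, 0.0), 1.0)     # assumed: keep combination convex

    beta = (1.0 - theta) * beta_hs_plus + theta * beta_dy
    return -g + beta * d_prev, theta, d3

rng = np.random.default_rng(0)
g_prev = rng.normal(size=5)
g = rng.normal(size=5)
d_prev = -g_prev                          # e.g. previous steepest-descent step
d, theta, d3 = hybrid_hs_dy_direction(g, g_prev, d_prev)
```

The sufficient-descent identity of the three-term target, d3ᵀg = -‖g‖², follows from the cancellation of the β_HS and θ_z terms and is what motivates fitting the hybrid direction to it.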
Pages: 673-681
Page count: 9