ON THE USE OF POWELL'S RESTART STRATEGY TO CONJUGATE GRADIENT METHODS

Cited: 0
Authors
Kou, Cai-Xia [1 ]
Zhang, Wen-Hui [2 ]
Ai, Wen-Bao [1 ]
Liu, Ya-Feng [2 ]
Affiliations
[1] Beijing Univ Posts & Telecommun, Sch Sci, Beijing 100876, Peoples R China
[2] Chinese Acad Sci, Acad Math & Syst Sci, Inst Computat Math & Sci Engn Comp, State Key Lab Sci & Engn Comp, Beijing 100190, Peoples R China
Source
PACIFIC JOURNAL OF OPTIMIZATION | 2014, Vol. 10, No. 1
Keywords
CG method; unconstrained optimization; Powell's restart strategy; global convergence; generalized improved Wolfe line search; FLETCHER-REEVES METHOD; LINE SEARCH; GLOBAL CONVERGENCE; OPTIMIZATION; PROPERTY; MINIMIZATION; ALGORITHM; DESCENT;
DOI
Not available
Chinese Library Classification
C93 [Management Science]; O22 [Operations Research];
Discipline Codes
070105; 12; 1201; 1202; 120202;
Abstract
Restart strategies are often used in conjugate gradient (CG) methods to improve their computational efficiency. In this paper, we propose to apply Powell's restart strategy [20] to four classical CG methods: the Fletcher-Reeves (FR) method [10], the Polak-Ribiere-Polyak (PRP) method [18,19], the Hestenes-Stiefel (HS) method [15], and the Dai-Yuan (DY) method [6]. We show that the corresponding variants of all four CG methods enjoy the sufficient descent property and are globally convergent when a generalized improved Wolfe line search is performed. Numerical results show that applying Powell's restart strategy to the four classical CG methods can significantly improve their computational efficiency. In particular, the numerical performance of the restarted variant of the HS method compares favorably with that of Dai and Kou's CGOPT [3].
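The scheme the abstract describes can be sketched as follows. This is a minimal illustrative sketch, not the paper's algorithm: it uses the classical Powell restart test (restart with steepest descent when |g_{k+1}^T g_k| ≥ 0.2 ||g_{k+1}||^2) on top of the PRP update, and a simple Armijo backtracking line search in place of the generalized improved Wolfe line search analyzed in the paper; all parameter values here are illustrative assumptions.

```python
import numpy as np

def cg_powell_restart(f, grad, x0, tol=1e-8, max_iter=1000):
    """PRP conjugate gradient sketch with Powell's restart test.

    Restart (reset to steepest descent) when consecutive gradients
    lose orthogonality: |g_{k+1}^T g_k| >= 0.2 * ||g_{k+1}||^2.
    The Armijo backtracking line search stands in for the paper's
    generalized improved Wolfe line search.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                          # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g.dot(d) >= 0:           # safeguard: enforce a descent direction
            d = -g
        # Armijo backtracking line search (illustrative parameters)
        alpha, fx = 1.0, f(x)
        while f(x + alpha * d) > fx + 1e-4 * alpha * g.dot(d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Powell's restart criterion
        if abs(g_new.dot(g)) >= 0.2 * g_new.dot(g_new):
            d = -g_new              # restart with steepest descent
        else:
            beta = g_new.dot(g_new - g) / g.dot(g)   # PRP formula
            d = -g_new + max(beta, 0.0) * d          # PRP+ clamp
        x, g = x_new, g_new
    return x

# Usage: minimize an ill-conditioned convex quadratic
D = np.diag([1.0, 10.0])
f = lambda x: 0.5 * x.dot(D).dot(x)
grad = lambda x: D.dot(x)
x_star = cg_powell_restart(f, grad, np.array([5.0, 1.0]))
```

The restart test detects when successive gradients are far from orthogonal, a sign that conjugacy has degraded and the current search direction carries stale curvature information; resetting to the steepest descent direction is what yields the efficiency gains the abstract reports.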
Pages: 85-104
Page count: 20
References
24 in total
[1] Al-Baali, M. Descent property and global convergence of the Fletcher-Reeves method with inexact line search. IMA Journal of Numerical Analysis, 1985, 5(1): 121-124.
[2] Beale, E. M. L. In: Numerical Methods for Nonlinear Optimization, 1972, p. 39.
[3] Dai, Y. H. Nonlinear Conjugate Gradient Methods, 2000.
[4] Dai, Y. H., Yuan, Y. Convergence properties of the Fletcher-Reeves method. IMA Journal of Numerical Analysis, 1996, 16(2): 155-164.
[5] Dai, Y. H., Yuan, Y. An efficient hybrid conjugate gradient method for unconstrained optimization. Annals of Operations Research, 2001, 103(1-4): 33-47.
[6] Dai, Y. H., Liao, L. Z. New conjugacy conditions and related nonlinear conjugate gradient methods. Applied Mathematics and Optimization, 2001, 43(1): 87-101.
[7] Dai, Y. H., Yuan, Y. A nonlinear conjugate gradient method with a strong global convergence property. SIAM Journal on Optimization, 1999, 10(1): 177-182.
[8] Dai, Y.-H., Kou, C.-X. A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search. SIAM Journal on Optimization, 2013, 23(1): 296-320.
[9] Dolan, E. D., Moré, J. J. Benchmarking optimization software with performance profiles. Mathematical Programming, 2002, 91(2): 201-213.
[10] Fletcher, R., Reeves, C. M. Function minimization by conjugate gradients. Computer Journal, 1964, 7(2): 149-154.