A new spectral conjugate gradient method for large-scale unconstrained optimization

Cited: 48
Authors
Jian, Jinbao [1 ,2 ]
Chen, Qian [2 ]
Jiang, Xianzhen [1 ]
Zeng, Youfang [2 ]
Yin, Jianghua [2 ]
Affiliations
[1] Yulin Normal Univ, Guangxi Coll & Univ Key Lab Complex Syst Optimiza, Yulin 537000, Peoples R China
[2] Guangxi Univ, Coll Math & Informat Sci, Nanning 530004, Peoples R China
Keywords
large-scale unconstrained optimization; spectral conjugate gradient method; global convergence; numerical experiments; 90C25; 90C30; convergence conditions; algorithm; descent; minimization; property
DOI
10.1080/10556788.2016.1225213
Chinese Library Classification
TP31 [computer software];
Discipline classification codes
081202; 0835
Abstract
Spectral conjugate gradient methods, which combine a simple construction with good numerical performance, are an effective class of methods for solving large-scale unconstrained optimization problems. In this paper, based on the quasi-Newton direction and the quasi-Newton condition, and motivated by the idea of the spectral conjugate gradient method as well as Dai and Kou's technique for selecting the conjugate parameter [SIAM J. Optim. 23 (2013), pp. 296-320], a new approach for generating spectral parameters is presented. It introduces a new double-truncating technique that ensures both the sufficient descent property of the search directions and the boundedness of the sequence of spectral parameters. A new spectral conjugate gradient method for large-scale unconstrained optimization is then proposed. Under either the strong Wolfe line search or the generalized Wolfe line search, the proposed method is globally convergent. Finally, extensive comparative numerical experiments on large-scale instances with one thousand to two million variables are reported; the results show that the proposed method is promising.
Pages: 503-515
Number of pages: 13