Eigenvalues versus singular values study in conjugate gradient algorithms for large-scale unconstrained optimization

Cited by: 13
Authors
Andrei, Neculai [1]
Affiliations
[1] Res Inst Informat, Ctr Adv Modeling & Optimizat, 8-10 Averescu Ave, Bucharest 1, Romania
Keywords
unconstrained optimization; conjugate gradient algorithms; eigenvalues; singular values; Wolfe conditions; convergence; sufficient descent condition; conjugacy condition; 49M07; 49M10; 90C06; 65K; GLOBAL CONVERGENCE; DESCENT;
DOI
10.1080/10556788.2016.1225211
Chinese Library Classification
TP31 [Computer software];
Subject classification codes
081202; 0835;
Abstract
Two different approaches based on the eigenvalues and the singular values of the matrix representing the search direction in conjugate gradient algorithms are considered. Using a special approximation of the inverse Hessian of the objective function, which depends on a positive parameter, we obtain a search direction that satisfies both the sufficient descent condition and the Dai-Liao conjugacy condition. In the first approach the parameter in the search direction is determined by clustering the eigenvalues of the matrix defining it. The second approach minimizes the condition number of the matrix representing the search direction; in this case the resulting conjugate gradient algorithm is exactly the three-term conjugate gradient algorithm proposed by Zhang, Zhou and Li. The global convergence of the algorithms is proved for uniformly convex functions. Intensive numerical experiments on 800 unconstrained optimization test problems show that the two approaches have similar numerical performance, and that both algorithms are significantly more efficient and more robust than the CG-DESCENT algorithm of Hager and Zhang. By solving five applications from the MINPACK-2 test problem collection, each with 10^6 variables, we show that the suggested conjugate gradient algorithms outperform CG-DESCENT.
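The three-term search direction the abstract attributes to Zhang, Zhou and Li can be illustrated with a minimal sketch. The code below is a hypothetical illustration, not the paper's implementation: it applies a Zhang-Zhou-Li-style three-term conjugate gradient iteration to a convex quadratic with an exact line search (the paper itself uses Wolfe line searches on general functions); the function name `three_term_cg` and the test matrix are our own choices.

```python
# Hypothetical sketch (assumed names, not the paper's code): a
# Zhang-Zhou-Li-style three-term CG iteration on a convex quadratic
# f(x) = 0.5 x^T A x - b^T x, whose gradient is g(x) = A x - b.
import numpy as np

def three_term_cg(A, b, x0, tol=1e-8, max_iter=200):
    x = x0.astype(float)
    g = A @ x - b                  # gradient of the quadratic
    d = -g                         # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Exact line search for a quadratic: alpha = -g^T d / d^T A d.
        alpha = -(g @ d) / (d @ A @ d)
        x_new = x + alpha * d
        g_new = A @ x_new - b
        y = g_new - g              # gradient difference y_k
        gg = g @ g
        beta = (g_new @ y) / gg    # Polak-Ribiere-Polyak beta_k
        theta = (g_new @ d) / gg   # coefficient of the third term
        # Three-term direction: by construction g_new^T d = -||g_new||^2,
        # i.e. the sufficient descent condition holds independent of beta.
        d = -g_new + beta * d - theta * y
        x, g = x_new, g_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = three_term_cg(A, b, np.zeros(2))
```

Substituting the direction into g_new^T d shows the beta and theta terms cancel, leaving exactly -||g_new||^2, which is why this scheme guarantees descent regardless of the line search.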
Pages: 534-551
Number of pages: 18
Related references
44 records in total (first 10 shown)
[1] Andrei, Neculai. Acceleration of conjugate gradient algorithms for unconstrained optimization. Applied Mathematics and Computation, 2009, 213(2): 361-369.
[2] [Anonymous]. Numerical Methods for Nonlinear Variational Problems. 1984.
[3] [Anonymous]. Preprint MCS-P153-0692, Argonne National Laboratory. 1992.
[4] [Anonymous]. Technical Report.
[5] Aris, R. The Mathematical Theory of Diffusion and Reaction in Permeable Catalysts, Vol. 1: The Theory of the Steady State. 1975.
[6] Axelsson, O.; Lindskog, G. On the rate of convergence of the preconditioned conjugate-gradient method. Numerische Mathematik, 1986, 48(5): 499-523.
[7] Axelsson, O. Computer Methods in Applied Mechanics and Engineering, 1976, 9: 123. DOI 10.1016/0045-7825(76)90056-6.
[8] Babaie-Kafaki, Saman; Ghanbari, Reza. A modified scaled conjugate gradient method with global convergence for nonconvex functions. Bulletin of the Belgian Mathematical Society - Simon Stevin, 2014, 21(3): 465-477.
[9] Babaie-Kafaki, Saman; Ghanbari, Reza. The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices. European Journal of Operational Research, 2014, 234(3): 625-630.
[10] Bebernes, J. Mathematical Problems from Combustion Theory. 1989.