Global convergence of a memory gradient method without line search

Cited by: 9
Author
Yu Z. [1 ]
Affiliation
[1] College of Science, University of Shanghai for Science and Technology
Funding
National Natural Science Foundation of China
Keywords
Global convergence; Memory gradient; Unconstrained optimization; Without line search;
DOI
10.1007/s12190-007-0021-4
Abstract
In this paper, we develop a memory gradient method for unconstrained optimization. The main characteristic of this method is that the next iterate is obtained without any line search. Under certain conditions, we establish the strong global convergence of the proposed method. © 2007 KSCAM and Springer-Verlag.
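The idea summarized in the abstract — a search direction that mixes the current gradient with the previous direction (the "memory" term), combined with a step size fixed in advance rather than found by line search — can be sketched as below. This is a minimal illustrative sketch: the constants `eta` and `L`, the step rule `1/L`, the cap on the memory coefficient, and the test function are all assumptions, not the specific update formulas or convergence conditions of the paper.

```python
import numpy as np

def memory_gradient(f_grad, x0, eta=0.2, L=4.0, tol=1e-8, max_iter=1000):
    """Illustrative memory gradient iteration without line search.

    Direction: d_k = -g_k + beta_k * d_{k-1}  (memory term),
    with beta_k kept small so d_k remains a descent direction.
    Step size: a fixed alpha = 1/L instead of a line search, where L is
    an assumed Lipschitz-type constant (NOT the paper's step-size rule).
    """
    x = np.asarray(x0, dtype=float)
    g = f_grad(x)
    d = -g                      # first step is steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x = x + (1.0 / L) * d   # fixed step, no line search
        g_new = f_grad(x)
        # memory coefficient, bounded away from blow-up by the denominator
        beta = eta * np.dot(g_new, g_new) / max(np.dot(d, d), 1e-16)
        d = -g_new + beta * d
        g = g_new
    return x

# usage: minimize f(x) = x1^2 + 2*x2^2 (gradient Lipschitz constant 4)
x_star = memory_gradient(lambda x: np.array([2.0 * x[0], 4.0 * x[1]]),
                         [3.0, -2.0])
```

On this strongly convex quadratic the iteration contracts toward the minimizer at the origin; for nonconvex problems the paper's point is precisely which conditions on the direction and step guarantee global convergence without any line search.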
Pages: 545-553
Page count: 8
References
11 references
[1] Al-Baali M., Descent property and global convergence of the Fletcher-Reeves method with inexact line search, IMA J. Numer. Anal., 5, pp. 121-124, (1985)
[2] Chen X.D., Sun J., Global convergence of a two-parameter family of conjugate gradient methods without line search, J. Comput. Appl. Math., 146, pp. 37-45, (2002)
[3] Cragg E.E., Levy A.V., Study on a supermemory gradient method for the minimization of functions, J. Optim. Theory Appl., 4, pp. 191-205, (1969)
[4] Dai Y., Yuan Y., Nonlinear Conjugate Gradient Methods, (2000)
[5] Gilbert J.C., Nocedal J., Global convergence properties of conjugate gradient methods for optimization, SIAM J. Optim., 2, pp. 21-42, (1992)
[6] Moré J.J., Garbow B.S., Hillstrom K.E., Testing unconstrained optimization software, ACM Trans. Math. Softw., 7, pp. 17-41, (1981)
[7] Miele A., Cantrell J.W., Study on a memory gradient method for the minimization of functions, J. Optim. Theory Appl., 3, pp. 459-470, (1969)
[8] Narushima Y., Yabe H., Global convergence of a memory gradient method for unconstrained optimization, Comput. Optim. Appl., 35, pp. 325-346, (2006)
[9] Shi Z.J., A new memory gradient method under exact line search, Asia-Pac. J. Oper. Res., 20, pp. 275-284, (2003)
[10] Shi Z.J., A new class of memory gradient methods with inexact line searches, J. Numer. Math., 13, pp. 53-72, (2005)