Gradient method with multiple damping for large-scale unconstrained optimization

Cited by: 6
Authors
Sim, Hong Seng [2 ,3 ,4 ]
Leong, Wah June [1 ,2 ,3 ]
Chen, Chuei Yee [2 ,3 ]
Affiliations
[1] Chongqing Normal Univ, Sch Math Sci, Chongqing 401331, Peoples R China
[2] Univ Putra Malaysia, Inst Math Res, Serdang 43400, Malaysia
[3] Univ Putra Malaysia, Dept Math, Serdang 43400, Malaysia
[4] Xiamen Univ Malaysia, Sch Fdn Studies, Sepang 43600, Malaysia
Keywords
Gradient method; Backtracking line search; Nonmonotone line search; Multiple damping; Large-scale optimization;
DOI
10.1007/s11590-018-1247-9
CLC classification
C93 [Management]; O22 [Operations Research];
Discipline codes
070105; 12; 1201; 1202; 120202;
Abstract
Gradient methods are popular because they require only the gradient of the objective function. However, they can be very slow when the objective function is ill-conditioned. One possible reason for this inefficiency is that the steplength is chosen by a fixed criterion aimed only at reducing the function value, which yields a stable dynamical system with slow convergence. To overcome this, we propose a new gradient method with multiple damping, which works on the objective function and the norm of the gradient vector simultaneously. Specifically, the method combines damping with line search strategies: an individual adaptive parameter damps the gradient vector, while line searches reduce the function value. Global convergence of the proposed method is established under both backtracking and nonmonotone line searches. Finally, numerical results show that the proposed algorithm performs better than some well-known CG-based methods.
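The abstract's recipe, damping the gradient vector with an adaptive parameter while a backtracking (Armijo) line search reduces the function value, can be sketched as follows. This is a minimal illustration only: the damping factor `mu = 1 / (1 + ||g||)` is a hypothetical placeholder, not the paper's multiple-damping scheme.

```python
import numpy as np

def damped_gradient_method(f, grad, x0, max_iter=500, tol=1e-6,
                           c1=1e-4, rho=0.5):
    """Sketch of a damped gradient method with backtracking (Armijo)
    line search. The damping rule below is an illustrative assumption;
    it does not reproduce the adaptive scheme of the paper."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        gnorm = np.linalg.norm(g)
        if gnorm < tol:              # stop once the gradient norm is small
            break
        mu = 1.0 / (1.0 + gnorm)     # hypothetical damping factor in (0, 1]
        d = -mu * g                  # damped steepest-descent direction
        alpha = 1.0
        # Backtracking line search: shrink alpha until the Armijo
        # sufficient-decrease condition on the function value holds.
        while f(x + alpha * d) > f(x) + c1 * alpha * np.dot(g, d):
            alpha *= rho
        x = x + alpha * d
    return x

# Usage on a mildly ill-conditioned quadratic f(x) = x1^2 + 10*x2^2
x_star = damped_gradient_method(
    lambda v: v[0]**2 + 10.0 * v[1]**2,
    lambda v: np.array([2.0 * v[0], 20.0 * v[1]]),
    [1.0, 1.0])
```

Since `d` is a positive multiple of the steepest-descent direction, the Armijo loop always terminates, and the iterates decrease `f` monotonically; the nonmonotone variant analyzed in the paper would instead compare against a maximum of recent function values.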
Pages: 617-632
Page count: 16
Related papers
50 items in total
  • [1] Gradient method with multiple damping for large-scale unconstrained optimization
    Hong Seng Sim
    Wah June Leong
    Chuei Yee Chen
    Optimization Letters, 2019, 13: 617-632
  • [2] A new accelerated conjugate gradient method for large-scale unconstrained optimization
    Yuting Chen
    Mingyuan Cao
    Yueting Yang
    Journal of Inequalities and Applications, 2019
  • [3] A descent nonlinear conjugate gradient method for large-scale unconstrained optimization
    Yu, Gaohang
    Zhao, Yanlin
    Wei, Zengxin
    APPLIED MATHEMATICS AND COMPUTATION, 2007, 187 (02): 636-643
  • [4] A spectral conjugate gradient method for solving large-scale unconstrained optimization
    Liu, J. K.
    Feng, Y. M.
    Zou, L. M.
    COMPUTERS & MATHEMATICS WITH APPLICATIONS, 2019, 77 (03): 731-739
  • [6] A new spectral conjugate gradient method for large-scale unconstrained optimization
    Jian, Jinbao
    Chen, Qian
    Jiang, Xianzhen
    Zeng, Youfang
    Yin, Jianghua
    OPTIMIZATION METHODS & SOFTWARE, 2017, 32 (03): 503-515
  • [8] A Conjugate Gradient Method with Global Convergence for Large-Scale Unconstrained Optimization Problems
    Yao, Shengwei
    Lu, Xiwen
    Wei, Zengxin
    JOURNAL OF APPLIED MATHEMATICS, 2013
  • [9] An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization
    Liu, Zexian
    Liu, Hongwei
    NUMERICAL ALGORITHMS, 2018, 78 (01): 21-39