Nonlinear Optimization Using Adaptive Restarting Conjugate Gradient Methods [적응 재시작 공액 기울기 방법을 사용한 비선형 최적화]

Cited by: 0
Author
Kim, Sung-Soo [1]
Affiliation
[1] Dept. of Electrical and Electronic Engineering, Chungbuk National University
Keywords
Adaptive restart; Stagnation; Unconstrained nonlinear conjugate gradient methods
DOI
10.5370/KIEE.2024.73.8.1437
Abstract
Conjugate gradient methods are optimization techniques used to minimize cost functions, including those arising in nonconvex problem domains. For non-quadratic objectives, however, exact line searches are impractical, and effective restart procedures are needed to retain good convergence properties. This paper introduces a modified conjugate gradient method with adaptive restarting, designed for nonconvex objective functions, whose goal is to prevent the iterates from stagnating. The adaptive restart increases the likelihood of escaping convergence stagnation. Numerical experiments demonstrate the improved convergence behavior of the proposed method, showing that it effectively mitigates stagnation during the iteration. © The Korean Institute of Electrical Engineers.
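The record does not reproduce the authors' adaptive restart rule, so the following is only a minimal Python sketch of a nonlinear conjugate gradient loop with a conventional Powell-type restart test and an inexact (Armijo) line search, illustrating the general idea the abstract describes. The names cg_with_restart, armijo_step, and the threshold nu are assumptions for this sketch, not elements of the paper.

```python
# Sketch: nonlinear conjugate gradient with a Powell-type restart test.
# The restart criterion below (loss of orthogonality between successive
# gradients) is a standard illustration, not the authors' method.
import numpy as np

def armijo_step(f, x, d, g, t=1.0, c=1e-4, shrink=0.5, max_iter=50):
    """Backtracking (inexact) line search; exact line searches are
    impractical for non-quadratic objectives, as the abstract notes."""
    fx = f(x)
    for _ in range(max_iter):
        if f(x + t * d) <= fx + c * t * g.dot(d):
            return t
        t *= shrink
    return t

def cg_with_restart(f, grad, x0, tol=1e-6, max_iter=1000, nu=0.2):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                          # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        t = armijo_step(f, x, d, g)
        x_new = x + t * d
        g_new = grad(x_new)
        # Polak-Ribiere+ coefficient, clipped at zero
        beta = max(0.0, g_new.dot(g_new - g) / g.dot(g))
        # Powell-type restart: drop the history when successive gradients
        # are far from orthogonal, i.e. conjugacy has degraded.
        if abs(g_new.dot(g)) >= nu * g_new.dot(g_new):
            beta = 0.0              # restart with steepest descent
        d = -g_new + beta * d
        if g_new.dot(d) >= 0:       # safeguard: keep a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage example on the Rosenbrock benchmark function
if __name__ == "__main__":
    f = lambda z: (1 - z[0])**2 + 100 * (z[1] - z[0]**2)**2
    grad = lambda z: np.array([-2 * (1 - z[0]) - 400 * z[0] * (z[1] - z[0]**2),
                               200 * (z[1] - z[0]**2)])
    print(cg_with_restart(f, grad, np.array([-1.2, 1.0])))
```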
Pages: 1437-1448
Number of pages: 11