Enhanced Conjugate Gradient Method for Unconstrained Optimization and Its Application in Neural Networks

Times Cited: 1
Authors
Omar, Dlovan Haji [1 ]
Ibrahim, Alaa Luqman [1 ]
Hassan, Masoud Muhammed [2 ]
Fathi, Bayda Ghanim [1 ]
Sulaiman, Diman Abdulqader [1 ]
Affiliations
[1] Univ Zakho, Coll Sci, Dept Math, Zakho, Kurdistan Region, Iraq
[2] Univ Zakho, Coll Sci, Dept Comp Sci, Zakho, Kurdistan Region, Iraq
Source
EUROPEAN JOURNAL OF PURE AND APPLIED MATHEMATICS | 2024, Vol. 17, No. 4
Keywords
Optimization; conjugate gradient; neural networks; convergence analysis;
DOI
10.29020/nybg.ejpam.v17i4.5354
Chinese Library Classification
O1 [Mathematics]
Discipline Codes
0701; 070101
Abstract
In this study, we present a novel conjugate gradient method designed for unconstrained optimization problems. Traditional conjugate gradient methods have proven effective on many optimization problems, but they may encounter challenges in the unconstrained setting. Our method addresses this issue by introducing modifications that enhance its performance on such problems. We show that, under certain conditions, the method satisfies the sufficient descent condition and is globally convergent, ensuring progress toward the optimal solution at each iteration. To showcase the practical applicability of our approach, we apply the method to training a feed-forward neural network for estimating the values of a continuous trigonometric function. To evaluate the efficiency and effectiveness of the modified approach, we conducted numerical experiments on a set of well-known test functions. These experiments show that our algorithm significantly reduces computational time, owing to its faster convergence rate and quicker directional minimization. These results highlight the advantages of our approach over traditional conjugate gradient methods for unconstrained optimization problems.
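The record does not reproduce the paper's modified update formula, so as background a classical nonlinear conjugate gradient scheme of the kind the abstract builds on can be sketched as follows. This is an illustrative Fletcher-Reeves variant with an Armijo backtracking line search, not the authors' method; the test function and all parameter values below are assumptions for the example.

```python
import numpy as np

def fletcher_reeves_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Classical Fletcher-Reeves nonlinear CG with Armijo backtracking.

    Generic illustration only -- the paper's modified update formula is
    not reproduced in this record.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # start with steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking line search for the Armijo sufficient-decrease condition
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = g_new.dot(g_new) / g.dot(g)   # Fletcher-Reeves coefficient
        d = -g_new + beta * d
        if g_new.dot(d) >= 0:                # safeguard: restart if not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# Example (assumed test problem): minimize f(x, y) = (x - 1)^2 + 10 (y + 2)^2,
# whose unique minimizer is (1, -2)
f = lambda v: (v[0] - 1) ** 2 + 10 * (v[1] + 2) ** 2
grad = lambda v: np.array([2 * (v[0] - 1), 20 * (v[1] + 2)])
x_star = fletcher_reeves_cg(f, grad, np.array([0.0, 0.0]))
```

The sufficient descent and global convergence properties claimed in the abstract are exactly the guarantees that such safeguards and line-search conditions are designed to secure.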
Pages: 2692-2705
Page count: 14
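The abstract's application trains a feed-forward network to estimate a continuous trigonometric function. As a self-contained illustration (the paper's architecture, data, and optimizer settings are not given in this record), a one-hidden-layer tanh network can be fitted to sin(x) by minimizing the mean-squared error with SciPy's generic nonlinear conjugate gradient driver; the hidden-layer size and random seed below are assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative only: a one-hidden-layer tanh network fitted to sin(x) by
# nonlinear CG. The paper's actual network and training setup are not
# reproduced in this record.
rng = np.random.default_rng(0)
X = np.linspace(-np.pi, np.pi, 200)      # training inputs (assumed)
y = np.sin(X)                            # targets
H = 10                                   # hidden units (assumed)

def unpack(p):
    """Split the flat parameter vector into layer weights and biases."""
    return p[:H], p[H:2 * H], p[2 * H:3 * H], p[3 * H]

def loss_and_grad(p):
    w1, b1, w2, b2 = unpack(p)
    z = np.outer(X, w1) + b1             # (N, H) pre-activations
    h = np.tanh(z)                       # hidden activations
    pred = h @ w2 + b2
    e = pred - y
    L = np.mean(e ** 2)
    # Backpropagate the mean-squared-error gradient by hand
    de = 2.0 * e / X.size                # dL/dpred
    g_w2 = h.T @ de
    g_b2 = de.sum()
    dz = np.outer(de, w2) * (1.0 - h ** 2)
    g_w1 = X @ dz                        # sum over samples of X_i * dL/dz_ih
    g_b1 = dz.sum(axis=0)
    return L, np.concatenate([g_w1, g_b1, g_w2, [g_b2]])

p0 = rng.normal(0.0, 0.5, 3 * H + 1)
res = minimize(loss_and_grad, p0, jac=True, method="CG")
```

Flattening all weights into a single vector is what lets a general-purpose unconstrained optimizer, such as the paper's method, drive the training.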
Cited References
37 references in total
[1]   A three-term conjugate gradient descent method with some applications [J].
Alhawarat, Ahmad ;
Salleh, Zabidin ;
Alolaiyan, Hanan ;
El Hor, Hamid ;
Ismail, Shahrina .
JOURNAL OF INEQUALITIES AND APPLICATIONS, 2024, 2024 (01)
[2]  
Andrei N., 2020, Nonlinear Conjugate Gradient Methods for Unconstrained Optimization
[3]  
Andrei N., 2008, ADV MODEL OPTIM, V10, P147
[4]   Accelerated conjugate gradient algorithm with finite difference Hessian/vector product approximation for unconstrained optimization [J].
Andrei, Neculai .
JOURNAL OF COMPUTATIONAL AND APPLIED MATHEMATICS, 2009, 230 (02) :570-582
[5]  
Bishop CM, 1995, Neural networks for pattern recognition
[6]   An efficient hybrid conjugate gradient method with an adaptive strategy and applications in image restoration problems [J].
Chen, Zibo ;
Shao, Hu ;
Liu, Pengjie ;
Li, Guoxin ;
Rong, Xianglin .
APPLIED NUMERICAL MATHEMATICS, 2024, 204 :362-379
[7]   Convergence properties of nonlinear conjugate gradient methods [J].
Dai, YH ;
Han, JY ;
Liu, GH ;
Sun, DF ;
Yin, HX ;
Yuan, YX .
SIAM JOURNAL ON OPTIMIZATION, 2000, 10 (02) :345-358
[8]   A nonlinear conjugate gradient method with a strong global convergence property [J].
Dai, YH ;
Yuan, Y .
SIAM JOURNAL ON OPTIMIZATION, 1999, 10 (01) :177-182
[9]   FUNCTION MINIMIZATION BY CONJUGATE GRADIENTS [J].
FLETCHER, R ;
REEVES, CM .
COMPUTER JOURNAL, 1964, 7 (02) :149-154
[10]   A RAPIDLY CONVERGENT DESCENT METHOD FOR MINIMIZATION [J].
FLETCHER, R ;
POWELL, MJD .
COMPUTER JOURNAL, 1963, 6 (02) :163-168