An Improved Cuckoo Search Algorithm for Optimization of Artificial Neural Network Training

Cited by: 0
Authors
Pedda Nagyalla Maddaiah
Pournami Pulinthanathu Narayanan
Affiliations
[1] Department of Computer Science and Engineering, National Institute of Technology Calicut
Source
Neural Processing Letters | 2023, Vol. 55
Keywords
Artificial neural network; Metaheuristic; Numerical optimization; Cuckoo search; Voronoi diagram;
DOI
Not available
Abstract
Artificial neural networks are widely used to solve engineering design problems across many disciplines because of their simplicity, efficiency, and adaptability, and they produce promising and accurate predictions. A trained network solves these problems using the weights and biases obtained during training, in which the weights and biases are updated so that the difference between the predicted and actual values is minimized. Training typically relies on stochastic gradient (steepest) descent methods to update the weights and biases. These methods are good at finding an optimal solution, but they suffer from vanishing gradients at local minima and critical points and are sensitive to the initial weights and biases. As a result, training can fall into local minima, training time becomes high, and accuracy becomes low. One effective way to overcome these problems is to replace stochastic gradient descent with a metaheuristic algorithm. Among metaheuristics, the Cuckoo search algorithm is widely used in many applications because of its simplicity and efficiency. In this work, we propose an improved Cuckoo search algorithm that incorporates the Voronoi diagram into Cuckoo search to strengthen its weak areas and to address the stated problems of artificial neural network training. The performance of the proposed algorithm is evaluated on higher-dimensional benchmark functions and on benchmark data sets, and it is compared with variants of Cuckoo search and other metaheuristic algorithms. The proposed algorithm shows better results in terms of the number of generations, accuracy, cross-entropy, and root mean square error (RMSE).
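The abstract describes replacing gradient-based training with Cuckoo search. For readers unfamiliar with the base algorithm, the following is a minimal sketch of *standard* Cuckoo search with Lévy flights (Mantegna's algorithm for the step length), shown minimizing the sphere benchmark function — not the authors' Voronoi-based variant, and all function names and parameter values here are illustrative assumptions.

```python
import math
import random

def levy_step(beta=1.5):
    # Mantegna's algorithm: draw a Levy(beta)-distributed step length.
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
             / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = random.gauss(0, sigma)
    v = random.gauss(0, 1)
    return u / abs(v) ** (1 / beta)

def cuckoo_search(f, dim, n_nests=15, pa=0.25, iters=2000, lo=-5.0, hi=5.0):
    # Initialize nests (candidate solutions) uniformly at random in the box.
    nests = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_nests)]
    fitness = [f(x) for x in nests]
    best = min(range(n_nests), key=lambda k: fitness[k])
    for _ in range(iters):
        # Generate a new cuckoo via a (scalar) Levy flight biased toward the best nest.
        i = random.randrange(n_nests)
        step = 0.01 * levy_step()
        new = [min(hi, max(lo, x + step * (x - nests[best][j])))
               for j, x in enumerate(nests[i])]
        fn = f(new)
        # Replace a randomly chosen nest if the new solution is better.
        j = random.randrange(n_nests)
        if fn < fitness[j]:
            nests[j], fitness[j] = new, fn
        # Abandon a fraction pa of the worst nests and rebuild them at random.
        order = sorted(range(n_nests), key=lambda k: fitness[k], reverse=True)
        for k in order[:int(pa * n_nests)]:
            nests[k] = [random.uniform(lo, hi) for _ in range(dim)]
            fitness[k] = f(nests[k])
        best = min(range(n_nests), key=lambda k: fitness[k])
    return nests[best], fitness[best]

# Usage: minimize the 5-dimensional sphere function (global minimum 0 at the origin).
sphere = lambda x: sum(v * v for v in x)
random.seed(0)
best_x, best_f = cuckoo_search(sphere, dim=5)
print("best fitness:", best_f)
```

Applied to neural network training as the abstract proposes, each "nest" would hold a flattened vector of weights and biases, and `f` would be the training loss (e.g. cross-entropy or RMSE) of the network with those parameters.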
Pages: 12093–12120 (27 pages)