CLASSIFICATION OF SONAR DATA SET USING NEURAL NETWORK TRAINED BY GRAY WOLF OPTIMIZATION

Cited by: 60
Authors
Mosavi, M. R. [1 ]
Khishe, M. [1 ]
Ghamgosar, A. [1 ]
Affiliation
[1] Iran Univ Sci & Technol, Dept Elect Engn, Tehran 1684613114, Iran
Keywords
classification; sonar; Multi-Layer Perceptron Neural Network; Grey Wolf Optimization; Particle Swarm Optimization; Gravitational Search Algorithm; algorithm; evolutionary
DOI
10.14311/NNW.2016.26.023
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104 ; 0812 ; 0835 ; 1405
Abstract
Multi-Layer Perceptron Neural Networks (MLP NNs) are the most commonly used NNs for target classification, serving not only in simulated environments but also in real-world situations. Training such NNs is of significant importance, and many researchers have recently been attracted to this field. Conventional gradient descent and recursive methods have long been used to train NNs; however, low classification accuracy, slow convergence speed, and trapping in local minima are disadvantages of these traditional methods. To overcome these issues, heuristic and meta-heuristic algorithms have been widely used in recent years. This paper uses the Gray Wolf Optimization (GWO) algorithm for training the NN. The algorithm is inspired by the social hierarchy and hunting behavior of grey wolves. GWO has a superior ability to solve high-dimensional problems, so we apply it to classifying the sonar dataset. To evaluate the proposed method, the algorithm is compared to the Particle Swarm Optimization (PSO) algorithm, the Gravitational Search Algorithm (GSA), and their hybrid (PSOGSA) on three sets of data. The measured metrics are convergence speed, the probability of trapping in local minima, and classification accuracy. The results show that in most cases the proposed algorithm provides better, or at least comparable, performance relative to the other algorithms.
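The training scheme the abstract describes — encoding all MLP weights as one position vector and letting GWO's alpha, beta, and delta wolves pull the pack toward low-loss regions — can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the network size, hyperparameters, and the toy XOR task (standing in for the sonar dataset) are all assumptions.

```python
import numpy as np

def mlp_loss(w, X, y, n_hidden=4):
    """MSE of a 1-hidden-layer MLP whose weights are packed into the flat vector w."""
    n_in = X.shape[1]
    W1 = w[:n_in * n_hidden].reshape(n_in, n_hidden)
    b1 = w[n_in * n_hidden:n_in * n_hidden + n_hidden]
    off = n_in * n_hidden + n_hidden
    W2 = w[off:off + n_hidden]
    b2 = w[off + n_hidden]
    h = np.tanh(X @ W1 + b1)                    # hidden-layer activations
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid output unit
    return np.mean((out - y) ** 2)

def gwo(loss, dim, n_wolves=20, iters=200, seed=0):
    """Standard GWO: each wolf moves toward the average of the alpha/beta/delta pulls."""
    rng = np.random.default_rng(seed)
    wolves = rng.uniform(-1.0, 1.0, (n_wolves, dim))
    for t in range(iters):
        fitness = np.array([loss(w) for w in wolves])
        alpha, beta, delta = wolves[np.argsort(fitness)[:3]]  # three best wolves
        a = 2.0 - 2.0 * t / iters                             # decreases linearly 2 -> 0
        for i in range(n_wolves):
            cand = np.zeros(dim)
            for leader in (alpha, beta, delta):
                r1, r2 = rng.random(dim), rng.random(dim)
                A = 2.0 * a * r1 - a
                C = 2.0 * r2
                D = np.abs(C * leader - wolves[i])
                cand += leader - A * D                        # pull toward this leader
            wolves[i] = cand / 3.0                            # average of the three pulls
    fitness = np.array([loss(w) for w in wolves])
    return wolves[np.argmin(fitness)], fitness.min()

# Toy binary-classification task (XOR), a stand-in for the 60-feature sonar data.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([0., 1., 1., 0.])
dim = 2 * 4 + 4 + 4 + 1  # weight count of a 2-4-1 MLP
best_w, best_loss = gwo(lambda w: mlp_loss(w, X, y), dim)
```

Note that the gradient never appears: GWO treats `mlp_loss` as a black box, which is what lets it escape the local minima that plague back-propagation, at the cost of many more loss evaluations.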
Pages: 393-415
Number of pages: 23
Cited References
44 in total
[1] Abedifar V., 2013, 2013 21 IR C EL ENG, P1.
[2] [Anonymous], 2013, IEEE PES INNOV SMART, DOI 10.1109/ISGT-LA.2013.6554383.
[3] [Anonymous], 2001, MSc Thesis.
[4] Auer P., Burgsteiner H., Maass W. A learning rule for very simple universal approximators consisting of a single layer of perceptrons. Neural Networks, 2008, 21(5):786-795.
[5] Barakat M., Lefebvre D., Khalil M., Druaux F., Mustapha O. Parameter selection algorithm with self adaptive growing neural network classifier for diagnosis issues. International Journal of Machine Learning and Cybernetics, 2013, 4(3):217-233.
[6] Derrac J., Garcia S., Molina D., Herrera F. A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms. Swarm and Evolutionary Computation, 2011, 1(1):3-18.
[7] Dorigo M., Birattari M., Stuetzle T. Ant colony optimization: artificial ants as a computational intelligence technique. IEEE Computational Intelligence Magazine, 2006, 1(4):28-39.
[8] Emary E., Zawbaa H.M., Grosan C., Hassenian A.E. Feature subset selection approach by gray-wolf optimization. Afro-European Conference for Industrial Advancement, AECIA 2014, 2015, 334:1-13.
[9] Garcia S., Molina D., Lozano M., Herrera F. A study on the use of non-parametric tests for analyzing the evolutionary algorithms' behaviour: a case study on the CEC'2005 Special Session on Real Parameter Optimization. Journal of Heuristics, 2009, 15(6):617-644.
[10] Gorman R.P., Sejnowski T.J. Analysis of hidden units in a layered network trained to classify sonar targets. Neural Networks, 1988, 1(1):75-89.