A new fusion of grey wolf optimizer algorithm with a two-phase mutation for feature selection

Cited by: 254
Authors
Abdel-Basset, Mohamed [1 ]
El-Shahat, Doaa [2 ]
El-henawy, Ibrahim [2 ]
de Albuquerque, Victor Hugo C. [3 ]
Mirjalili, Seyedali [4 ]
Affiliations
[1] Zagazig Univ, Fac Comp & Informat, Dept Operat Res, Zagazig, Egypt
[2] Zagazig Univ, Fac Comp & Informat, Comp Sci Dept, Zagazig, Egypt
[3] Univ Fortaleza, Fortaleza, Ceara, Brazil
[4] Torrens Univ Australia, 90 Bowen Terrace, Fortitude Valley, Qld 4006, Australia
Keywords
Feature selection; Grey wolf optimization algorithm; Wrapper method; Classification accuracy; Cross-validation; Mutation; PARTICLE SWARM OPTIMIZATION; CROW SEARCH ALGORITHM; CLASSIFICATION;
DOI
10.1016/j.eswa.2019.112824
CLC number
TP18 [Theory of Artificial Intelligence];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Because of their high dimensionality, large datasets can hinder the data mining process. Feature selection is therefore a mandatory preprocessing phase that reduces the dimensionality of a dataset by retaining only the most informative features while maximizing classification accuracy. This paper proposes a new Grey Wolf Optimizer algorithm integrated with a two-phase mutation to solve the feature selection problem for classification using wrapper methods. A sigmoid function transforms the continuous search space into a binary one, matching the binary nature of the feature selection problem. The two-phase mutation enhances the exploitation capability of the algorithm: the first mutation phase aims to reduce the number of selected features while preserving high classification accuracy, and the second phase attempts to add more informative features that increase the classification accuracy. Because mutation can be time-consuming, the two-phase mutation is applied with a small probability. Since wrapper methods can produce high-quality solutions, one of the most widely used classifiers, the k-Nearest Neighbor (k-NN) classifier, is employed, with the Euclidean distance used to find the k nearest neighbors. Each dataset is split into training and testing data using K-fold cross-validation to mitigate overfitting. Comparisons with well-known and recent algorithms, such as the flower pollination algorithm, particle swarm optimization, the multi-verse optimizer, the whale optimization algorithm, and the bat algorithm, are conducted on 35 datasets. Statistical analyses confirm the effectiveness of the proposed algorithm and its superior performance. (C) 2019 Elsevier Ltd. All rights reserved.
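The two core mechanisms described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names (`binarize`, `two_phase_mutation`), the per-bit mutation probability `p_mut`, and the improvement-only acceptance rule are assumptions for the sketch; the paper's actual fitness combines feature count and k-NN classification error, which is abstracted here as a generic minimized `fitness` callable.

```python
import math
import random


def sigmoid(x):
    # Transfer function mapping a continuous GWO position component to [0, 1].
    return 1.0 / (1.0 + math.exp(-x))


def binarize(position, rng=random.random):
    # Convert a continuous wolf position into a binary feature mask:
    # a feature is selected (1) when its sigmoid value exceeds a random threshold.
    return [1 if sigmoid(x) > rng() else 0 for x in position]


def two_phase_mutation(solution, fitness, p_mut=0.1, rng=random.random):
    # Phase 1 tries to drop selected features (flip 1 -> 0) to shrink the
    # subset; phase 2 tries to add unselected features (flip 0 -> 1) that
    # may raise accuracy. A flip is kept only if it improves (lowers) the
    # fitness value; each bit is considered with small probability p_mut.
    best = list(solution)
    best_fit = fitness(best)
    for target in (1, 0):  # target=1: removal phase; target=0: addition phase
        for i in range(len(best)):
            if best[i] == target and rng() < p_mut:
                cand = list(best)
                cand[i] = 1 - target
                cand_fit = fitness(cand)
                if cand_fit < best_fit:  # minimization: accept only improvements
                    best, best_fit = cand, cand_fit
    return best
```

For example, with a toy fitness that simply counts selected features, the removal phase strips redundant features while the addition phase makes no change, since adding a feature can only worsen that fitness.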
Pages: 14