Chaotic binary Group Search Optimizer for feature selection

Cited by: 60
Authors
Abualigah, Laith [1 ,2 ]
Diabat, Ali [3 ,4 ]
Affiliations
[1] Amman Arab Univ, Fac Comp Sci & Informat, Amman 11953, Jordan
[2] Univ Sains Malaysia, Sch Comp Sci, George Town 11800, Malaysia
[3] New York Univ Abu Dhabi, Div Engn, Abu Dhabi 129188, U Arab Emirates
[4] NYU, Tandon Sch Engn, Dept Civil & Urban Engn, Brooklyn, NY 11201 USA
Keywords
Group Search Optimizer (GSO); Chaotic maps; Feature selection (FS); Optimization problem; Meta-heuristic algorithm; PARTICLE SWARM OPTIMIZATION; TEXT FEATURE-SELECTION; GENE SELECTION; ALGORITHM; CLASSIFICATION;
DOI
10.1016/j.eswa.2021.116368
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Feature selection (FS) is recognized as one of the most common and challenging problems in the machine learning domain. FS can be formulated as an optimization problem that requires an effective optimizer to determine the optimal subset of the most informative features. This paper proposes a wrapper FS method, called CGSO, which combines chaotic maps (CMs) with a binary Group Search Optimizer (GSO) to solve the FS problem. In this method, five chaotic maps, namely Logistic, Piecewise, Singer, Sinusoidal, and Tent, are incorporated into the main procedures of the GSO algorithm. The GSO algorithm is used as the search strategy, while k-NN is employed as the induction algorithm. The objective function integrates three main objectives: maximizing classification accuracy, minimizing the number of selected features, and minimizing the complexity of the generated k-NN models. To evaluate the performance of the proposed methods, twenty well-known UCI datasets are used and the results are compared with other well-known methods published in the literature. The obtained results reveal the superiority of the proposed methods over the compared methods, especially when the binary GSO is used with the Tent chaotic map. Finally, the proposed method is beneficial for systems that require FS pre-processing.
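The abstract only outlines the approach at a high level. The sketch below illustrates, under stated assumptions, how the two ingredients it describes can fit together: a chaotic map replacing a uniform random draw when perturbing a binary feature mask, and a wrapper fitness that combines k-NN accuracy with a penalty on the number of selected features. The Tent-map form, the weights alpha/beta, and the helper knn_wrapper_fitness are illustrative assumptions, not the authors' exact CGSO formulation.

```python
# Minimal sketch (illustrative assumptions, not the paper's exact method):
# a Tent chaotic map drives the randomness of a binary feature mask, and a
# wrapper fitness combines k-NN accuracy with a feature-count penalty.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def tent_map(x):
    """One iteration of a Tent chaotic map on (0, 1)."""
    return x / 0.7 if x < 0.7 else (10.0 / 3.0) * (1.0 - x)

def knn_wrapper_fitness(mask, X, y, alpha=0.99, beta=0.01, k=5):
    """Assumed wrapper fitness: weighted k-NN accuracy minus a penalty
    proportional to the fraction of selected features."""
    if not mask.any():                      # an empty feature subset is invalid
        return 0.0
    acc = cross_val_score(KNeighborsClassifier(n_neighbors=k),
                          X[:, mask], y, cv=5).mean()
    return alpha * acc - beta * mask.sum() / mask.size

# Usage: greedily accept chaotic perturbations of the mask that improve fitness.
rng = np.random.default_rng(0)
X, y = rng.normal(size=(120, 10)), rng.integers(0, 2, size=120)
chaos, mask = 0.37, rng.random(10) < 0.5
for _ in range(20):
    chaos = tent_map(chaos)                 # chaotic value in place of a uniform draw
    flip = rng.random(10) < chaos * 0.1     # chaos-scaled mutation of the mask
    candidate = np.logical_xor(mask, flip)
    if knn_wrapper_fitness(candidate, X, y) > knn_wrapper_fitness(mask, X, y):
        mask = candidate
print("selected features:", np.flatnonzero(mask))
```

In the paper the search over masks is carried out by the binary GSO's producer/scrounger/ranger procedures rather than the simple hill-climbing loop used here; the loop stands in only to show where the chaotic sequence and the wrapper fitness enter.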
Pages: 16
References
83 in total; entries [11]-[20] shown
[11] Aljarah, Ibrahim; Al-Zoubi, Ala M.; Faris, Hossam; Hassonah, Mohammad A.; Mirjalili, Seyedali; Saadeh, Heba. Simultaneous Feature Selection and Support Vector Machine Optimization Using the Grasshopper Optimization Algorithm. Cognitive Computation, 2018, 10(3): 478-495.
[12] Alomari, Osama Ahmad. Journal of Theoretical and Applied Information Technology, 2017, 95: 2610.
[13] Alomari, Osama Ahmad; Khader, Ahamad Tajudin; Al-Betar, Mohammed Azmi; Abualigah, Laith Mohammad. Gene selection for cancer classification by combining minimum redundancy maximum relevancy and bat-inspired algorithm. International Journal of Data Mining and Bioinformatics, 2017, 19(1): 32-51.
[14] Anonymous. Elements of Information Theory, 2005.
[15] Anonymous. FEATURE SELECTION KN, 2012.
[16] Arora, Sankalap; Sharma, Manik; Anand, Priyanka. A Novel Chaotic Interior Search Algorithm for Global Optimization and Feature Selection. Applied Artificial Intelligence, 2020, 34(4): 292-328.
[17] Arora, Sankalap; Anand, Priyanka. Binary butterfly optimization approaches for feature selection. Expert Systems with Applications, 2019, 116: 147-160.
[18] Arora, Sankalap; Anand, Priyanka. Chaotic grasshopper optimization algorithm for global optimization. Neural Computing & Applications, 2019, 31(8): 4385-4405.
[19] Azizi, R. International Journal of Computing, Communications and Networking, 2014, 3: 1.
[20] Binitha, S. Int. J. Soft Comput. Eng., 2012, 2: 137. DOI: 10.1007/S11269-015-0943-9.