Boosting Arithmetic Optimization Algorithm with Genetic Algorithm Operators for Feature Selection: Case Study on Cox Proportional Hazards Model

Cited by: 59
Authors
Ewees, Ahmed A. [1 ]
Al-qaness, Mohammed A. A. [2 ]
Abualigah, Laith [3 ,4 ]
Oliva, Diego [5 ]
Algamal, Zakariya Yahya [6 ]
Anter, Ahmed M. [7 ]
Ali Ibrahim, Rehab [8 ]
Ghoniem, Rania M. [9 ,10 ]
Abd Elaziz, Mohamed [8 ,11 ,12 ]
Affiliations
[1] Damietta Univ, Dept Comp, Dumyat 34517, Egypt
[2] Wuhan Univ, State Key Lab Informat Engn Surveying Mapping & R, Wuhan 430079, Peoples R China
[3] Amman Arab Univ, Fac Comp Sci & Informat, Amman 11953, Jordan
[4] Univ Sains Malaysia, Sch Comp Sci, Gelugor 11800, Malaysia
[5] Univ Guadalajara, Dept Comp Sci, Ctr Univ Ciencias Exactas & Ingn CUCEI, Guadalajara 44430, Mexico
[6] Univ Mosul, Dept Stat & Informat, Mosul 41002, Iraq
[7] Beni Suef Univ, Fac Comp & Artificial Intelligence, Bani Suwayf 62511, Egypt
[8] Zagazig Univ, Dept Math, Fac Sci, Zagazig 44519, Egypt
[9] Princess Nourah bint Abdulrahman Univ, Coll Comp & Informat Sci, Dept Informat Technol, Riyadh 84428, Saudi Arabia
[10] Mansoura Univ, Dept Comp, Mansoura 35516, Egypt
[11] Ajman Univ, Artificial Intelligence Res Ctr AIRC, POB 346, Ajman, U Arab Emirates
[12] Tomsk Polytech Univ, Sch Comp Sci & Robot, Tomsk 634050, Russia
Keywords
feature selection; data mining; machine learning; Arithmetic Optimization Algorithm (AOA); genetic algorithm; PARTICLE SWARM OPTIMIZATION; TEXT FEATURE-SELECTION; PREDICT SURVIVAL; GA ALGORITHM; WOLF; REDUCTION; SYSTEM; SCHEME;
DOI
10.3390/math9182321
Chinese Library Classification
O1 [Mathematics]
Subject Classification Codes
0701; 070101
Abstract
Feature selection is a well-known preprocessing procedure, and it is considered a challenging problem in many domains, such as data mining, text mining, medicine, biology, public health, image processing, data clustering, and others. This paper proposes a novel feature selection method, called AOAGA, using an improved metaheuristic optimization method that combines the conventional Arithmetic Optimization Algorithm (AOA) with Genetic Algorithm (GA) operators. The AOA is a recently proposed optimizer; it has been employed to solve several benchmark and engineering problems and has shown promising performance. The main aim behind modifying the AOA is to enhance its search strategies, since the conventional version suffers from a weak local search strategy and a poor trade-off between exploration and exploitation; the GA operators are introduced to overcome these shortcomings. The proposed AOAGA was evaluated on several well-known benchmark datasets using standard evaluation criteria, namely accuracy, number of selected features, and fitness value. The results were compared with state-of-the-art techniques to verify the performance of the proposed AOAGA method. Moreover, to further assess its performance, two real-world problems containing gene datasets were used. The findings of this paper show that the proposed AOAGA method finds new best solutions for several test cases and achieves promising results compared to other methods published in the literature.
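To make the hybrid idea concrete, the following is a minimal Python sketch of the kind of step the abstract describes: one AOA iteration (using the standard AOA update rules from the original AOA paper) followed by GA one-point crossover and uniform mutation, together with a typical wrapper fitness that balances classification error against the number of selected features. The function names, the parameters (alpha, mu, pc, pm), and the weighting w are illustrative assumptions, not the authors' published implementation.

```python
# A minimal sketch of one AOAGA-style iteration for wrapper feature selection.
# It assumes the standard AOA update rules and simple GA one-point crossover /
# uniform mutation; parameter names (alpha, mu, pc, pm) and the error-vs-size
# weight `w` are assumptions, not the authors' exact code.
import numpy as np


def fitness(mask, error_rate, w=0.99):
    """Typical wrapper objective: weighted classification error plus
    the fraction of selected features (lower is better)."""
    n_selected = int(mask.sum())
    if n_selected == 0:
        return 1.0                            # penalise the empty feature subset
    return w * error_rate + (1.0 - w) * n_selected / mask.size


def aoa_ga_step(pop, best, t, T, alpha=5, mu=0.5, pc=0.8, pm=0.02, rng=None):
    """One iteration: AOA arithmetic operators, then GA crossover/mutation.

    pop  : (n, dim) array of continuous positions in [0, 1]
    best : (dim,) best position found so far
    """
    rng = np.random.default_rng() if rng is None else rng
    n, dim = pop.shape
    lb, ub = 0.0, 1.0
    moa = 0.2 + t * (0.8 / T)                 # Math Optimizer Accelerated
    mop = 1.0 - (t / T) ** (1.0 / alpha)      # Math Optimizer Probability
    scale = (ub - lb) * mu + lb
    eps = np.finfo(float).eps
    new = pop.copy()
    for i in range(n):
        for j in range(dim):
            r1, r2, r3 = rng.random(3)
            if r1 > moa:                      # exploration: division / multiplication
                new[i, j] = best[j] / (mop + eps) * scale if r2 > 0.5 else best[j] * mop * scale
            else:                             # exploitation: subtraction / addition
                new[i, j] = best[j] - mop * scale if r3 > 0.5 else best[j] + mop * scale
    # GA operators applied to the AOA offspring to improve population diversity:
    for i in range(0, n - 1, 2):              # one-point crossover on paired solutions
        if rng.random() < pc:
            cut = int(rng.integers(1, dim))
            new[i, cut:], new[i + 1, cut:] = new[i + 1, cut:].copy(), new[i, cut:].copy()
    flip = rng.random(new.shape) < pm         # uniform mutation
    new[flip] = rng.uniform(lb, ub, size=int(flip.sum()))
    return np.clip(new, lb, ub)
```

In a scheme of this kind, each continuous solution is thresholded into a binary mask (e.g., pop > 0.5), a classifier is trained on the selected columns, and its error rate is passed to fitness(); the lowest-fitness mask over all iterations is returned as the selected feature subset.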
Pages: 22