IBJA: An improved binary DJaya algorithm for feature selection

Cited by: 20
Authors
Abed-alguni, Bilal H. [1 ]
AL-Jarah, Saqer Hamzeh [1 ]
Affiliations
[1] Yarmouk Univ, Dept Comp Sci, Irbid, Jordan
Keywords
DJaya; Harris Hawks optimization; Feature selection; Dynamic opposition-based learning; GREY WOLF OPTIMIZER; JAYA ALGORITHM; DESIGN;
DOI
10.1016/j.jocs.2023.102201
Chinese Library Classification
TP39 [Computer applications];
Discipline codes
081203 ; 0835 ;
Abstract
Feature Selection (FS) is a preprocessing step in Machine Learning (ML) that removes unwanted features from datasets to increase the accuracy of ML classifiers. A popular binary variant of the continuous Jaya algorithm is the Discrete Jaya (DJaya) algorithm, which is commonly used for optimization problems with binary design variables (also known as binary decision variables). Nevertheless, DJaya tends to converge prematurely to local optima, and its performance deteriorates as the complexity of the optimization problem grows. This article improves DJaya by introducing the Improved Binary DJaya Algorithm (IBJA), which is specially designed to solve the FS problem. IBJA incorporates three techniques into DJaya. First, it integrates the update equation of the Harris Hawks Optimization (HHO) algorithm into the optimization loop of DJaya to enhance DJaya's search process and exploration abilities. Second, it applies a new Dynamic Opposition-Based Learning (DOBL) technique in the final steps of the optimization loop to further boost its search and exploration capabilities. Third, it employs binary transfer functions to derive binary solutions from the real-valued solutions generated by HHO and DOBL. IBJA's performance was assessed on 15 UCI datasets using four ML classifiers and compared against ten efficient optimization algorithms. In addition, the Friedman statistical test was employed to assess the reliability of the experimental findings. According to the overall experimental and statistical results, IBJA achieved the highest accuracy, best objective value, and fewest selected features on each of the 15 UCI datasets.
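To illustrate the third technique mentioned in the abstract, the sketch below shows how a binary transfer function can map a real-valued candidate solution to a binary feature mask, together with the basic opposite-solution computation that opposition-based learning builds on. This is a minimal illustration using the common S-shaped (sigmoid) transfer function and the standard opposition formula; the specific transfer functions and the dynamic-bound scheme used by IBJA are assumptions here, not taken from the paper.

```python
import numpy as np

def s_shaped_transfer(x, rng):
    """Map a real-valued solution vector x to a binary feature mask.

    Each component is converted to a selection probability via the
    sigmoid function, then binarized by a random threshold. This is
    the common S-shaped transfer function; IBJA's exact choice of
    transfer functions is an assumption in this sketch.
    """
    probs = 1.0 / (1.0 + np.exp(-x))          # selection probability per feature
    return (rng.random(x.shape) < probs).astype(int)

def opposite_solution(x, lb, ub):
    """Basic opposition-based learning: reflect x within bounds [lb, ub].

    Dynamic OBL variants shrink these bounds over the run; the static
    formula below is only the standard starting point.
    """
    return lb + ub - x

rng = np.random.default_rng(0)
x = np.array([-8.0, 0.0, 8.0])                # real-valued solution (illustrative)
mask = s_shaped_transfer(x, rng)              # binary mask: 1 = feature selected
opp = opposite_solution(x, lb=-10.0, ub=10.0) # opposite candidate: [18., 10., 2.] - wait, see note
```

A strongly negative component is almost never selected and a strongly positive one almost always is, so the real-valued search (HHO, DOBL) can be carried out in continuous space while the classifier is always evaluated on a binary feature subset.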
Pages: 14