An enhanced Harris hawk optimizer based on extreme learning machine for feature selection

Cited by: 0
Authors
Abdullah Alzaqebah
Omar Al-Kadi
Ibrahim Aljarah
Affiliations
[1] The University of Jordan, King Abdullah II School for Information Technology
[2] The World Islamic Sciences and Education University
Source
Progress in Artificial Intelligence | 2023 / Volume 12
Keywords
Machine learning; Harris hawk optimization; Feature selection; Extreme learning machine; Optimization
DOI
Not available
Abstract
The growth of data creates more analysis and mining challenges related to speed and accuracy. Feature selection (FS) is an optimization problem used as a preprocessing phase to reduce data dimensionality while preserving the best achievable classification accuracy. FS removes redundant and irrelevant features and retains the most informative ones. Various meta-heuristic optimization algorithms have been employed in the literature to solve the FS problem. This paper proposes an improved Harris hawk optimization algorithm, called IHHO, to find the optimal feature set for classification in a wrapper-based environment. Three main improvements are introduced into the binary version of HHO. The first speeds up convergence by initializing the population with the most informative features, using both filter-based and wrapper-based techniques during the initialization phase. The second balances global and local search and avoids getting trapped in local optima by means of an X-shaped transfer function. The third uses the extreme learning machine as the base classifier to guide the search process, speed up convergence, and improve the accuracy of the FS process. The proposed model was evaluated on 18 well-known UCI benchmarks and compared with the traditional HHO, particle swarm optimization, the grey wolf optimizer, the grasshopper optimization algorithm, and five standard filter-based techniques. The experimental results demonstrate the superior performance of IHHO compared with the other algorithms and methods reported in the literature.
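To make the wrapper-based FS setup described in the abstract concrete, the minimal Python sketch below shows how a continuous search-agent position can be binarized into a feature mask by a transfer function and scored with a basic extreme learning machine (random hidden layer plus least-squares output weights). The plain sigmoid transfer used here is only a stand-in for the paper's X-shaped transfer function, and the function names and the fitness weighting alpha are illustrative assumptions rather than the authors' implementation.

# Minimal sketch (not the authors' code): wrapper-based feature-selection
# fitness with a basic extreme learning machine (ELM) as the base classifier.
# The sigmoid transfer below stands in for the paper's X-shaped transfer
# function; names and the weighting "alpha" are illustrative assumptions.
import numpy as np

def elm_train(X, y_onehot, n_hidden=50, seed=0):
    # Random input weights and biases, analytic output weights (pseudo-inverse).
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                      # hidden-layer activations
    beta = np.linalg.pinv(H) @ y_onehot         # output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = np.tanh(X @ W + b)
    return np.argmax(H @ beta, axis=1)          # index of the winning class

def binarize(position, rng):
    # Map a continuous hawk position to a 0/1 feature mask via a transfer function.
    prob = 1.0 / (1.0 + np.exp(-position))      # sigmoid stand-in for the X-shaped transfer
    return (rng.random(position.shape) < prob).astype(int)

def fitness(mask, X_tr, y_tr, X_te, y_te, alpha=0.99):
    # Wrapper fitness: weighted classification error plus feature-ratio penalty.
    if mask.sum() == 0:
        return 1.0                              # empty subsets are worst-case
    classes = np.unique(y_tr)
    y_onehot = (y_tr[:, None] == classes).astype(float)
    W, b, beta = elm_train(X_tr[:, mask == 1], y_onehot)
    y_pred = classes[elm_predict(X_te[:, mask == 1], W, b, beta)]
    error = np.mean(y_pred != y_te)
    return alpha * error + (1.0 - alpha) * mask.mean()

In an HHO-style update loop, each hawk's continuous position vector would be passed through binarize and the resulting mask scored with fitness; lower fitness corresponds to higher accuracy with fewer selected features.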
Pages: 77-97
Number of pages: 20