Improved coral reefs optimization with adaptive β-hill climbing for feature selection

Cited by: 0
Authors
Ahmed, Shameem [1]
Ghosh, Kushal Kanti [1]
Garcia-Hernandez, Laura [2]
Abraham, Ajith [3]
Sarkar, Ram [1]
Affiliations
[1] Jadavpur University, Department of Computer Science & Engineering, Kolkata, India
[2] University of Cordoba, Area of Project Engineering, Cordoba, Spain
[3] Machine Intelligence Research Labs, Auburn, WA, USA
Keywords
Meta-heuristic; Feature selection; UCI; Coral reefs optimization; Adaptive beta-hill climbing; Hybrid optimization; Genetic algorithm
DOI
10.1007/s00521-020-05409-1
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405
Abstract
For any classification problem, the dimension of the feature vector used for classification is of great importance. In a high-dimensional feature vector, some features are often non-informative or even redundant, as they do not contribute to the learning process of the classifier; rather, they may cause low classification accuracy and long training times for the learning model. To address this issue, researchers apply various feature selection (FS) methods found in the literature. In recent years, meta-heuristic algorithms have proven effective in solving FS problems. Coral Reefs Optimization (CRO), a cellular-type evolutionary algorithm, maintains a good balance between exploration and exploitation. This has motivated us to present an improved version of CRO that incorporates adaptive beta-hill climbing to increase the exploitation ability of CRO. The proposed method is assessed on 18 standard UCI datasets using three distinct classifiers: K-nearest neighbors (KNN), Random Forest, and Naive Bayes. It is also compared with 10 state-of-the-art meta-heuristic FS procedures, and the results show excellent performance of the proposed FS method, achieving better results than the previous methods considered here for comparison. The source code of this work is publicly available at https://github.com/ahmed-shameem/Projects.
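To make the abstract's wrapper-style idea concrete, the following is a minimal sketch, not the authors' implementation: candidate solutions are binary feature masks scored by a KNN classifier, and a beta-hill-climbing-style local search refines a mask by flipping bits and keeping only improving neighbours. The function names, the flip-probability scheme, and the fitness weighting are illustrative assumptions; the paper's adaptive variant additionally tunes its parameters over iterations, which is omitted here.

```python
# Hypothetical sketch of wrapper FS with a beta-hill-climbing refinement step.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def fitness(mask, X, y, alpha=0.99):
    """Score a binary feature mask: cross-validated KNN accuracy plus a small
    reward for using fewer features (the weighting alpha is an assumption)."""
    if mask.sum() == 0:                      # an empty feature subset is invalid
        return 0.0
    acc = cross_val_score(KNeighborsClassifier(n_neighbors=5),
                          X[:, mask.astype(bool)], y, cv=5).mean()
    return alpha * acc + (1 - alpha) * (1 - mask.sum() / mask.size)

def beta_hill_climb(mask, X, y, iters=30, beta=0.3, seed=0):
    """Refine a candidate mask: flip each bit with probability beta and keep
    the neighbour only if it improves the fitness (hill-climbing acceptance)."""
    rng = np.random.default_rng(seed)
    best, best_fit = mask.copy(), fitness(mask, X, y)
    for _ in range(iters):
        cand = best.copy()
        flips = rng.random(cand.size) < beta
        cand[flips] = 1 - cand[flips]        # bit-flip neighbourhood move
        cand_fit = fitness(cand, X, y)
        if cand_fit > best_fit:              # greedy acceptance
            best, best_fit = cand, cand_fit
    return best, best_fit
```

In the full hybrid, a refinement step like this would be applied to promising corals produced by the CRO reproduction operators rather than run in isolation, and the classifier inside the fitness function could equally be Random Forest or Naive Bayes.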
Pages: 6467-6486
Number of pages: 20