Multi-objective feature selection algorithm using Beluga Whale Optimization

Cited: 0
Authors
Esfahani, Kiana Kouhpah [1 ]
Zade, Behnam Mohammad Hasani [1 ]
Mansouri, Najme [1 ]
Affiliations
[1] Department of Computer Science, Shahid Bahonar University of Kerman, Kerman, Iran
Keywords
Feature selection; Beluga Whale Optimization; Multi-objective optimization; Random forest; Particle swarm optimization; Classification
DOI
10.1016/j.chemolab.2024.105295
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
The advancement of science and technology has resulted in large datasets with noisy or redundant features that hamper classification. Feature selection chooses the relevant attributes to reduce dimensionality, thereby improving classification accuracy. Multi-objective optimization is crucial in feature selection because it allows simultaneous evaluation of multiple, often conflicting objectives, such as maximizing model accuracy and minimizing the number of features. Traditional single-objective methods may focus solely on accuracy, often yielding models that are complex and computationally expensive. Multi-objective optimization, in contrast, considers the trade-offs between criteria and identifies a set of optimal solutions (a Pareto front) in which no single solution is clearly superior. It is especially useful for high-dimensional datasets, as it reduces overfitting and enhances model performance by selecting the most informative subset of features. This article introduces and evaluates the binary version of Beluga Whale Optimization (BBWO) and the Multi-Objective Beluga Whale Optimization (MOBWO) algorithm in the context of feature selection. Feature subsets are encoded as binary matrices that denote the presence or absence of each feature, which simplifies stratifying the datasets. MOBWO emulates the exploration and exploitation patterns of Beluga Whale Optimization (BWO) in a continuous search space. The two conflicting objectives are maximizing classification accuracy and minimizing the size of the feature subset.
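The binary subset encoding and the Pareto notion of "no single solution is clearly superior" described above can be sketched as follows. This is an illustrative sketch, not the authors' implementation; the candidate error rates and mask sizes are made-up values:

```python
# Illustrative sketch (not the paper's code): a feature subset is encoded as
# a binary mask, and candidates are compared by Pareto dominance on two
# minimization objectives: classification error and number of selected features.

def evaluate(mask, error_rate):
    """Objective vector (error, n_features) for a binary feature mask."""
    return (error_rate, sum(mask))

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (both minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Keep only the non-dominated objective vectors."""
    return [s for s in solutions
            if not any(dominates(t, s) for t in solutions if t != s)]

# Four hypothetical candidates as (error, n_features) pairs:
candidates = [evaluate([1] * 10, 0.05), evaluate([1] * 12, 0.04),
              evaluate([1] * 8, 0.05), evaluate([1] * 5, 0.07)]
front = pareto_front(candidates)  # (0.05, 10) is dominated by (0.05, 8)
```

The surviving front trades accuracy against subset size: no member can be improved in one objective without worsening the other.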
MOBWO was compared with eleven well-known optimization algorithms on 12 datasets from the University of California, Irvine (UCI) repository: Genetic Algorithm (GA), Sine Cosine Algorithm (SCA), Bat Optimization Algorithm (BOA), Differential Evolution (DE), Whale Optimization Algorithm (WOA), Non-dominated Sorting Genetic Algorithm II (NSGA-II), Multi-Objective Particle Swarm Optimization (MOPSO), Multi-Objective Grey Wolf Optimizer (MOGWO), Multi-Objective Grasshopper Optimization Algorithm (MOGOA), Multi-Objective Non-dominated advanced Butterfly Optimization Algorithm (MONSBOA), and Multi-Objective Slime Mould Algorithm (MOSMA). In experiments using Random Forest (RF) as the classifier, several performance metrics were evaluated. The computational results show that the proposed BBWO algorithm achieves an average accuracy of 99.06% across the 12 datasets. In addition, the proposed MOBWO algorithm outperforms the existing multi-objective feature selection methods on all 12 datasets according to three metrics: Success Counting (SCC), Inverted Generational Distance (IGD), and Hypervolume (HV). For instance, MOBWO achieves an average HV at least 3.54% higher than every other method.
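For two minimization objectives, the hypervolume indicator reported above has a simple closed form: the area dominated by the front and bounded by a reference point. A minimal sketch, with an assumed reference point and made-up front values (not the paper's evaluation protocol):

```python
# Minimal 2-D hypervolume sketch (illustrative values only). Assumes `front`
# is already non-dominated, so sorting by the first objective makes the
# second objective strictly decreasing.

def hypervolume_2d(front, ref):
    """Area dominated by `front` w.r.t. reference point `ref` (minimization)."""
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in sorted(front):          # ascending in the first objective
        hv += (ref[0] - f1) * (prev_f2 - f2)  # slice between successive points
        prev_f2 = f2
    return hv

# A hypothetical (error, subset-size) front and an assumed reference point:
hv = hypervolume_2d([(1.0, 3.0), (2.0, 2.0), (3.0, 1.0)], ref=(4.0, 4.0))
```

A larger HV means the front covers more of the objective space, which is why comparisons such as the reported 3.54% average HV margin are meaningful across methods.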
Pages: 28