A Hybrid Binary Dragonfly Algorithm with an Adaptive Directed Differential Operator for Feature Selection

Cited by: 3
Authors
Chen, Yilin [1,2,3]
Gao, Bo [1]
Lu, Tao [1,2]
Li, Hui [1,2]
Wu, Yiqi [4]
Zhang, Dejun [4]
Liao, Xiangyun [5]
Affiliations
[1] Wuhan Inst Technol, Sch Comp Sci & Engn, Wuhan 430073, Peoples R China
[2] Wuhan Inst Technol, Hubei Key Lab Intelligent Robot, Wuhan 430073, Peoples R China
[3] Wuhan Inst Technol, Hubei Engn Res Ctr Intelligent Prod Line Equipment, Wuhan 430073, Peoples R China
[4] China Univ Geosci, Sch Comp Sci, Wuhan 430073, Peoples R China
[5] Chinese Acad Sci, Shenzhen Inst Adv Technol, Shenzhen 518052, Peoples R China
Keywords
feature selection; binary dragonfly algorithm; differential evolution algorithm; multiobjective optimization; classification; PARTICLE SWARM OPTIMIZATION; GREY WOLF OPTIMIZATION; EVOLUTION; SEARCH;
DOI
10.3390/rs15163980
Chinese Library Classification (CLC)
X [Environmental Science, Safety Science];
Subject Classification Codes
08; 0830;
Abstract
Feature selection is a typical multiobjective problem involving two conflicting objectives: in classification, it aims to improve or maintain classification accuracy while reducing the number of selected features. In practice, feature selection is one of the most important tasks in remote sensing image classification. In recent years, many metaheuristic algorithms have been applied to feature selection, such as the dragonfly algorithm (DA). The DA has a powerful search capability and achieves good results, but several shortcomings remain: its exploration ability weakens in later iterations, population diversity is insufficient, and convergence is slow. To overcome these shortcomings, we propose an improved dragonfly algorithm combined with a directed differential operator, called BDA-DDO. First, to enhance the exploration capability of the DA in the later stages, we introduce an adaptive step-updating mechanism in which the dragonfly step size decreases with the iteration count. Second, to accelerate convergence, we design a directed differential operator that provides a promising direction for the search. Third, we design an adaptive scheme for updating the directed differential operator to improve population diversity. The proposed method was evaluated on 14 mainstream public UCI datasets and compared with seven representative feature selection methods, including DA variants. The results show that BDA-DDO outperforms these representative and state-of-the-art DA variants in both convergence speed and solution quality.
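The abstract describes three mechanisms: an adaptive step size that decays over the iterations, a directed differential operator that gives the search a promising direction, and an adaptive update of that operator to preserve diversity. The Python sketch below shows one way a binary dragonfly-style update with these ingredients could be wired together; the transfer function, the constants, and the exact form of the differential term are illustrative assumptions, not the published BDA-DDO equations.

```python
import numpy as np

def v_transfer(v):
    # V-shaped transfer function that maps a continuous step to a bit-flip
    # probability (a common choice for binary swarm methods; the paper's
    # exact transfer function is not reproduced here).
    return np.abs(v / np.sqrt(1.0 + v * v))

def bda_ddo_sketch(fitness, n_features, pop_size=20, max_iter=100, seed=0):
    # Minimal sketch: binary feature masks updated with a decaying step
    # weight and a directed differential term pulling toward the best
    # solution found so far (all constants are assumptions).
    rng = np.random.default_rng(seed)
    pop = rng.integers(0, 2, size=(pop_size, n_features))   # binary feature masks
    step = np.zeros((pop_size, n_features))                 # continuous step ("velocity")
    fit = np.array([fitness(ind) for ind in pop])           # lower is better (minimization)
    best = pop[fit.argmin()].copy()
    best_fit = fit.min()

    for t in range(max_iter):
        w = 0.9 - (0.9 - 0.4) * t / max_iter                # assumed linear step-weight decay
        F = rng.uniform(0.2, 0.8)                           # assumed adaptive scale factor
        for i in range(pop_size):
            r1, r2 = rng.choice(pop_size, size=2, replace=False)
            # Directed differential operator (assumed form): a random difference
            # vector plus an attraction term toward the best individual.
            directed = F * (pop[r1] - pop[r2]) + F * (best - pop[i])
            step[i] = w * step[i] + directed
            flip = rng.random(n_features) < v_transfer(step[i])
            pop[i] = np.where(flip, 1 - pop[i], pop[i])      # probabilistic bit flips
            fit[i] = fitness(pop[i])
            if fit[i] < best_fit:
                best, best_fit = pop[i].copy(), fit[i]
    return best, best_fit

# Toy usage with a hypothetical fitness that combines an error term and the
# selected-feature ratio, mirroring the two conflicting objectives above.
if __name__ == "__main__":
    target = np.zeros(30)
    target[:5] = 1                                           # pretend only 5 features matter
    def fitness(mask):
        return np.abs(mask - target).mean() + 0.01 * mask.mean()
    mask, score = bda_ddo_sketch(fitness, n_features=30)
    print("selected features:", np.flatnonzero(mask), "score:", round(score, 4))
```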
Pages: 28