A Forward Search Inspired Particle Swarm Optimization Algorithm for Feature Selection in Classification

Cited: 2
Authors
Li, An-Da [1 ]
Xue, Bing [2 ]
Zhang, Mengjie [2 ]
Affiliations
[1] Tianjin Univ Commerce, Sch Management, Tianjin, Peoples R China
[2] Victoria Univ Wellington, Sch Engn & Comp Sci, Wellington, New Zealand
Source
2021 IEEE CONGRESS ON EVOLUTIONARY COMPUTATION (CEC 2021) | 2021
Funding
National Natural Science Foundation of China;
Keywords
Particle swarm optimization; feature selection; classification; sequential forward selection;
DOI
10.1109/CEC45853.2021.9504949
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
Particle swarm optimization (PSO) has been widely used for feature selection (FS) in classification. However, FS remains a challenging optimization task for PSO when the dimensionality of the data is high. In this paper, we propose a forward search inspired PSO (FSIPSO) algorithm to build a wrapper-based FS method. In FSIPSO, the search space changes dynamically during the evolutionary process. Specifically, we rank the features according to their single-feature classification performance and divide the search space into several sub-spaces. A forward search scheme is proposed to sequentially select the sub-spaces; the selected sub-spaces construct the search space for FSIPSO. With this scheme, FSIPSO first searches in a small space to quickly find candidate solutions (feature subsets) with relatively good performance. Then, the search space expands as more sub-spaces are selected, and FSIPSO can further select informative features in the expanded search space. Moreover, mutation operations are used in FSIPSO to avoid premature convergence. Experimental results on 8 UCI datasets show that FSIPSO obtains better FS results in less computation time than benchmark PSO-based FS methods. FSIPSO also obtains better convergence performance than these methods.
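The search-space expansion scheme described in the abstract could be sketched roughly as follows. This is a minimal illustration assuming an equal-size split of the ranked features into sub-spaces; the function names and details are hypothetical, not the authors' implementation:

```python
def forward_search_subspaces(feature_scores, n_subspaces):
    """Rank features by single-feature classification score (best first)
    and split the ranking into equally sized sub-spaces of feature indices.
    The equal-size split is an illustrative assumption."""
    ranked = sorted(range(len(feature_scores)),
                    key=lambda i: feature_scores[i], reverse=True)
    size = -(-len(ranked) // n_subspaces)  # ceiling division
    return [ranked[i:i + size] for i in range(0, len(ranked), size)]

def expand_search_space(subspaces, stage):
    """Union of the first `stage` sub-spaces: the set of features the
    swarm is allowed to select from at this stage of the forward search."""
    space = []
    for s in subspaces[:stage]:
        space.extend(s)
    return space
```

At stage 1 the swarm searches only among the top-ranked features; each later stage appends the next sub-space, mirroring the forward search scheme's gradual expansion of the search space.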
Pages: 786-793
Number of pages: 8