Adaptive Multi-Subswarm Optimisation for Feature Selection on High-Dimensional Classification

Cited: 18
Authors
Tran, Binh [1,2]
Xue, Bing [1]
Zhang, Mengjie [1]
Affiliations
[1] Victoria Univ Wellington, Sch Engn & Comp Sci, Wellington, New Zealand
[2] Can Tho Univ, Can Tho, Vietnam
Source
PROCEEDINGS OF THE 2019 GENETIC AND EVOLUTIONARY COMPUTATION CONFERENCE (GECCO'19) | 2019
Keywords
Feature selection; Particle swarm optimisation; High-dimensional data; Classification; PARTICLE SWARM OPTIMIZATION;
DOI
10.1145/3321707.3321713
CLC Classification Code
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Feature space is an important factor influencing the performance of any machine learning algorithm, including classification methods. Feature selection aims to remove irrelevant and redundant features that may negatively affect the learning process, especially on high-dimensional data, which usually suffers from the curse of dimensionality. Feature ranking is one of the most scalable feature selection approaches for high-dimensional problems, but most ranking methods fail to automatically determine the number of selected features or to detect redundancy between features. Particle swarm optimisation (PSO) is a population-based algorithm that has been shown to be effective in addressing these limitations. However, its performance on high-dimensional data is still limited due to the large search space and high computation cost. This study proposes the first adaptive multi-swarm optimisation (AMSO) method for feature selection, which can automatically select a feature subset of high-dimensional data more effectively and efficiently than the compared methods. The subswarms are automatically and dynamically changed based on their performance during the evolutionary process. Experiments on ten high-dimensional datasets of varying difficulty have shown that AMSO is more effective and more efficient than the compared PSO-based and traditional feature selection methods in most cases.
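The abstract describes PSO-based feature selection, where each particle encodes a candidate feature subset as a binary mask and the swarm searches for masks that balance relevance against subset size. The following is a minimal, hedged sketch of plain binary PSO with a sigmoid transfer function on a synthetic toy problem; it is not the paper's AMSO algorithm (which additionally maintains multiple adaptive subswarms), and the fitness function, feature counts, and parameter values are illustrative assumptions only.

```python
import math
import random

random.seed(42)

# Toy setup (assumed, not from the paper): 20 features, of which only
# features {0, 3, 7} are "relevant". The fitness rewards selecting the
# relevant features and lightly penalises subset size, mimicking the
# accuracy-vs-dimensionality trade-off in wrapper feature selection.
NUM_FEATURES = 20
RELEVANT = {0, 3, 7}

def fitness(mask):
    hits = sum(1 for i in RELEVANT if mask[i])
    size = sum(mask)
    return hits - 0.05 * size  # reward relevance, penalise large subsets

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def binary_pso(num_particles=20, iterations=60, w=0.7, c1=1.5, c2=1.5):
    # Random binary positions and zero velocities.
    pos = [[random.randint(0, 1) for _ in range(NUM_FEATURES)]
           for _ in range(num_particles)]
    vel = [[0.0] * NUM_FEATURES for _ in range(num_particles)]
    pbest = [p[:] for p in pos]
    pbest_fit = [fitness(p) for p in pos]
    g = max(range(num_particles), key=lambda i: pbest_fit[i])
    gbest, gbest_fit = pbest[g][:], pbest_fit[g]

    for _ in range(iterations):
        for i in range(num_particles):
            for d in range(NUM_FEATURES):
                r1, r2 = random.random(), random.random()
                # Standard velocity update pulled toward personal/global bests.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Sigmoid transfer: velocity -> probability of selecting bit d.
                pos[i][d] = 1 if random.random() < sigmoid(vel[i][d]) else 0
            f = fitness(pos[i])
            if f > pbest_fit[i]:
                pbest[i], pbest_fit[i] = pos[i][:], f
                if f > gbest_fit:
                    gbest, gbest_fit = pos[i][:], f
    return gbest, gbest_fit

best_mask, best_fit = binary_pso()
selected = [i for i, bit in enumerate(best_mask) if bit]
print("selected features:", selected)
```

In a real wrapper method, `fitness` would instead train and evaluate a classifier on the masked features; AMSO's contribution lies in splitting the swarm into subswarms whose sizes and membership adapt to their performance, which this single-swarm sketch does not attempt.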
Pages: 481-489
Page count: 9