A New Representation in PSO for Discretization-Based Feature Selection

Cited by: 165
Authors
Tran, Binh [1 ]
Xue, Bing [1 ]
Zhang, Mengjie [1 ]
Affiliations
[1] Victoria University of Wellington, Evolutionary Computation Research Group, Wellington, New Zealand
Keywords
Classification; discretization; feature selection (FS); high-dimensional data; particle swarm optimization (PSO)
DOI
10.1109/TCYB.2017.2714145
Chinese Library Classification (CLC)
TP [automation technology, computer technology]
Discipline code
0812
Abstract
In machine learning, discretization and feature selection (FS) are important preprocessing techniques for improving the performance of learning algorithms on high-dimensional data. Since many FS methods require discrete data, a common practice is to apply discretization before FS. Moreover, for the sake of efficiency, features are usually discretized individually (univariate discretization). This scheme rests on the assumption that each feature influences the task independently, which may not hold when feature interactions exist. Univariate discretization can therefore degrade the subsequent FS stage, because information about feature interactions may be lost during discretization. Initial results of our previously proposed method, evolve particle swarm optimization (EPSO), showed that combining discretization and FS in a single stage using bare-bones particle swarm optimization (BBPSO) can lead to better performance than applying them in two separate stages. In this paper, we propose a new method called potential particle swarm optimization (PPSO), which employs a new representation that reduces the search space of the problem and a new fitness function that better evaluates candidate solutions to guide the search. Results on ten high-dimensional datasets show that PPSO selects less than 5% of the available features on every dataset. Compared with the two-stage approach that uses BBPSO for FS on the discretized data, PPSO achieves significantly higher accuracy on seven datasets. In addition, PPSO obtains classification performance better than or similar to EPSO on eight datasets, with a smaller number of selected features on six datasets. Furthermore, PPSO also outperforms the three compared traditional methods and performs similarly to one method on most datasets in terms of both generalization ability and learning capacity.
Pages: 1733-1746
Page count: 14
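The abstract describes folding discretization and feature selection into a single PSO stage, with a representation based on per-feature cut points. The following is a minimal illustrative sketch of that single-stage idea only, not the published PPSO algorithm: the bare-bones PSO update is standard, but the particle decoding (an out-of-range cut point drops the feature), the toy correlation-based fitness, and the names bbpso_discretize_select and evaluate_subset are assumptions introduced here for illustration.

import numpy as np

rng = np.random.default_rng(42)

def evaluate_subset(X_disc, y, selected):
    # Toy fitness (assumption): mean |correlation| between each selected
    # discretized feature and the class label, minus a small sparsity penalty.
    # A real fitness would be based on classification performance instead.
    if not selected.any():
        return 0.0
    cols = np.flatnonzero(selected)
    scores = [abs(np.corrcoef(X_disc[:, j], y)[0, 1])
              for j in cols if X_disc[:, j].std() > 0]
    if not scores:
        return 0.0
    return float(np.mean(scores)) - 0.01 * selected.mean()

def bbpso_discretize_select(X, y, n_particles=20, n_iters=50):
    # Single-stage discretization + feature selection with a bare-bones PSO loop.
    # Simplified representation: one candidate cut point per feature; a cut
    # point outside the feature's observed range means "do not select it".
    n_features = X.shape[1]
    lo, hi = X.min(axis=0), X.max(axis=0)
    span = hi - lo
    positions = rng.uniform(lo - 0.5 * span, hi + 0.5 * span,
                            size=(n_particles, n_features))
    pbest = positions.copy()
    pbest_fit = np.full(n_particles, -np.inf)
    gbest, gbest_fit = positions[0].copy(), -np.inf

    def decode(p):
        selected = (p >= lo) & (p <= hi)      # in-range cut point => keep feature
        X_disc = (X > p).astype(int)          # binary discretization at the cut point
        return X_disc, selected

    for _ in range(n_iters):
        for i in range(n_particles):
            X_disc, selected = decode(positions[i])
            fit = evaluate_subset(X_disc, y, selected)
            if fit > pbest_fit[i]:
                pbest_fit[i], pbest[i] = fit, positions[i].copy()
            if fit > gbest_fit:
                gbest_fit, gbest = fit, positions[i].copy()
        # Bare-bones PSO: resample each position from a Gaussian whose mean and
        # spread are derived from the personal and global bests (no velocities).
        mu = (pbest + gbest) / 2.0
        sigma = np.abs(pbest - gbest) + 1e-12
        positions = rng.normal(mu, sigma)
    return gbest, gbest_fit

if __name__ == "__main__":
    X = rng.normal(size=(100, 30))
    y = (X[:, 0] + X[:, 3] > 0).astype(int)   # only features 0 and 3 are relevant
    best, fit = bbpso_discretize_select(X, y)
    kept = np.flatnonzero((best >= X.min(axis=0)) & (best <= X.max(axis=0)))
    print("selected features:", kept, "fitness:", round(fit, 3))

Because a single continuous value per feature both selects the feature and places its cut point, the search space stays at one dimension per feature, which is the kind of compact representation the abstract refers to (the actual PPSO encoding and fitness are more elaborate).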