Efficient Feature Selection Algorithm Based on Particle Swarm Optimization With Learning Memory

Cited by: 27
Authors
Wei, Bo [1 ,2 ]
Zhang, Wensheng [2 ]
Xia, Xuewen [3 ]
Zhang, Yinglong [3 ]
Yu, Fei [3 ]
Zhu, Zhiliang [1 ]
Affiliations
[1] East China Jiaotong Univ, Sch Software, Nanchang 330013, Jiangxi, Peoples R China
[2] Chinese Acad Sci, Inst Automat, Beijing 100190, Peoples R China
[3] Minnan Normal Univ, Coll Phys & Informat Engn, Zhangzhou 363000, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Combinatorial optimization; feature selection; global optimization; learning memory; particle swarm optimization; FEATURE SUBSET-SELECTION; GENETIC ALGORITHM; CLASSIFICATION; SEARCH; PERFORMANCE; RELEVANCE;
DOI
10.1109/ACCESS.2019.2953298
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
Feature selection is an important pre-processing step in machine learning and data mining that improves the performance of learning models by removing redundant and irrelevant features. Many feature selection algorithms, including greedy and random search approaches, have been studied to find a subset of the most informative features for a particular task (e.g., classification or regression). As a powerful swarm-based meta-heuristic, particle swarm optimization (PSO) is reported to be well suited to optimization problems with continuous search spaces; however, traditional PSO has rarely been applied to feature selection, which is a discrete search problem. In this paper, a novel feature selection algorithm based on PSO with learning memory (PSO-LM) is proposed. The learning memory strategy is designed to inherit more useful knowledge from individuals that have higher fitness and make faster progress, and a genetic operation is used to balance the local exploitation and global exploration of the algorithm. Moreover, the k-nearest neighbor method is used as the classifier to evaluate the classification accuracy of each particle. The proposed method has been evaluated on standard benchmark data sets, and the results demonstrate its superiority over other wrapper-based feature selection methods.
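The record contains no code, so the sketch below only illustrates the wrapper-style fitness evaluation described in the abstract: a binary mask selects a feature subset, and the cross-validated accuracy of a k-nearest-neighbor classifier on that subset serves as the particle's fitness. The function name `knn_fitness`, the toy data set, and parameters such as `k=5` are illustrative assumptions, not part of the authors' PSO-LM implementation.

```python
# Illustrative sketch (not the authors' PSO-LM code): wrapper fitness of a
# binary feature mask, scored by k-nearest-neighbor cross-validation accuracy.
import numpy as np
from sklearn.datasets import load_wine
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def knn_fitness(mask, X, y, k=5, folds=5):
    """Return the mean CV accuracy of a kNN classifier on the selected features."""
    if not mask.any():                      # empty subset: assign worst fitness
        return 0.0
    selected = X[:, mask.astype(bool)]      # keep only features where mask == 1
    clf = KNeighborsClassifier(n_neighbors=k)
    return cross_val_score(clf, selected, y, cv=folds).mean()

# Example: evaluate one random particle (a candidate feature subset) on a toy data set.
X, y = load_wine(return_X_y=True)
rng = np.random.default_rng(0)
particle = rng.integers(0, 2, size=X.shape[1])
print(f"selected {particle.sum()} features, fitness = {knn_fitness(particle, X, y):.3f}")
```

In a full PSO-based feature selector, a fitness function like this would be called for every particle in every iteration, typically combined with a penalty on subset size to favor compact feature sets.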
Pages: 166066-166078
Page count: 13