Hybrid particle swarm optimization with spiral-shaped mechanism for feature selection

Cited by: 137
Authors
Chen, Ke [1 ]
Zhou, Feng-Yu [1 ]
Yuan, Xian-Feng [2 ]
Affiliations
[1] Shandong Univ, Sch Control Sci & Engn, Jinan 250061, Shandong, Peoples R China
[2] Shandong Univ, Sch Mech Elect & Informat Engn, Weihai 264209, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Particle swarm optimization; Feature selection; Classification; Optimization; ANT COLONY OPTIMIZATION; UNSUPERVISED FEATURE-SELECTION; INTEGRATING FEATURE-SELECTION; REDUNDANCY FEATURE-SELECTION; FEATURE SUBSET-SELECTION; GENE SELECTION; DIFFERENTIAL EVOLUTION; KRILL HERD; ALGORITHM; CLASSIFICATION;
DOI
10.1016/j.eswa.2019.03.039
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
The "curse of dimensionality" is one of the largest problems that influences the quality of the optimization process in most data mining, pattern recognition, and machine learning tasks. Using high dimensional datasets to train a classification model may reduce the generalization performance of the learned model. In addition, high dimensionality of the dataset results in high computational and memory costs. Feature selection is an important data preprocessing approach in many practical application domains that are relevant to expert and intelligent systems. Feature selection aims at selecting a subset of informative and relevant features from an original feature dataset. Therefore, using a feature selection approach to process the original data prior to the learning process is essential for enhancing the performance on the classification task. In this paper, hybrid particle swarm optimization with a spiral shaped mechanism (HPSO-SSM) is proposed for selecting the optimal feature subset for classification via a wrapper-based approach. In HPSO-SSM, we make three improvements: First, a logistic map sequence is used to enhance the diversity in the search process. Second, two new parameters are introduced into the original position update formula, which can effectively improve the position quality of the next generation. Finally, a spiral-shaped mechanism is adopted as a local search operator around the known optimal solution region. For a complete evaluation, the proposed HPSO-SSM method is compared with six state-of-the-art meta-heuristic optimization algorithms, ten well-known wrapper-based feature selection techniques, and six classic filter-based feature selection methods. Various assessment indicators are used to properly evaluate and compare the performances of these approaches on twenty classic benchmark classification datasets from the UCI machine learning repository. According to the experimental results and statistical tests, the developed methods effectively and efficiently improve the classification accuracy compared with other wrapper-based approaches and filter-based approaches. The results demonstrate the high performance of the HPSO-SSM method in searching the feasible feature space and selecting the most informative attributes for solving classification problems. Therefore, the HPSO-SSM method has broad application prospects as a new feature selection approach. (C) 2019 Elsevier Ltd. All rights reserved.
Pages: 140-156
Page count: 17
Related papers (50 total)
  • [41] A hybrid particle swarm optimization for feature subset selection by integrating a novel local search strategy
    Moradi, Parham
    Gholampour, Mozhgan
    APPLIED SOFT COMPUTING, 2016, 43 : 117 - 130
  • [42] A Stable Hybrid Method for Feature Subset Selection using Particle Swarm Optimization with Local Search
    Dhrif, Hassen
    Giraldo, Luis G. S.
    Kubat, Miroslav
    Wuchty, Stefan
    PROCEEDINGS OF THE 2019 GENETIC AND EVOLUTIONARY COMPUTATION CONFERENCE (GECCO'19), 2019, : 13 - 21
  • [43] A maximum relevance minimum redundancy hybrid feature selection algorithm based on particle swarm optimization
    Yao, Xu
    Wang, Xiao-Dan
    Zhang, Yu-Xi
    Quan, Wen
    Kongzhi yu Juece/Control and Decision, 2013, 28 (03): : 413 - 417
  • [44] Hybrid approach of improved binary particle swarm optimization and shuffled frog leaping for feature selection
    Rajamohana, S. P.
    Umamaheswari, K.
    COMPUTERS & ELECTRICAL ENGINEERING, 2018, 67 : 497 - 508
  • [45] Hybrid binary dragonfly enhanced particle swarm optimization algorithm for solving feature selection problems
    Tawhid, Mohamed A.
    Dsouza, Kevin B.
    MATHEMATICAL FOUNDATIONS OF COMPUTING, 2018, 1 (02): : 181 - 200
  • [46] Improved salp swarm algorithm based on particle swarm optimization for feature selection
    Ibrahim, Rehab Ali
    Ewees, Ahmed A.
    Oliva, Diego
    Abd Elaziz, Mohamed
    Lu, Songfeng
    JOURNAL OF AMBIENT INTELLIGENCE AND HUMANIZED COMPUTING, 2019, 10 (08) : 3155 - 3169
  • [48] Chaotic Maps in Binary Particle Swarm Optimization for Feature Selection
    Yang, Cheng-San
    Chuang, Li-Yeh
    Li, Jung-Chike
    Yang, Cheng-Hong
    2008 IEEE CONFERENCE ON SOFT COMPUTING IN INDUSTRIAL APPLICATIONS SMCIA/08, 2009, : 107 - +
  • [49] A Tunable Particle Swarm Size Optimization Algorithm for Feature Selection
    Mallenahalli, Naresh
    Sarma, T. Hitendra
    2018 IEEE CONGRESS ON EVOLUTIONARY COMPUTATION (CEC), 2018, : 134 - 140
  • [50] Simultaneous Feature Selection and Clustering Using Particle Swarm Optimization
    Swetha, K. P.
    Devi, V. Susheela
    NEURAL INFORMATION PROCESSING, ICONIP 2012, PT I, 2012, 7663 : 509 - 515