Pareto front feature selection based on artificial bee colony optimization

Cited by: 244
Authors
Hancer, Emrah [1 ,2 ]
Xue, Bing [3 ]
Zhang, Mengjie [3 ]
Karaboga, Dervis [1 ]
Akay, Bahriye [1 ]
Affiliations
[1] Mehmet Akif Ersoy Univ, Dept Comp Technol & Informat Syst, TR-15030 Burdur, Turkey
[2] Erciyes Univ, Dept Comp Engn, TR-38039 Kayseri, Turkey
[3] Victoria Univ Wellington, Sch Engn & Comp Sci, POB 600, Wellington 6140, New Zealand
Keywords
Feature selection; Classification; Multi-objective optimization; Artificial bee colony; Particle swarm optimization; Evolutionary algorithm; Mutual information
DOI
10.1016/j.ins.2017.09.028
Chinese Library Classification (CLC)
TP [Automation Technology & Computer Technology]
Discipline Classification Code
0812
Abstract
Feature selection has two major conflicting aims, i.e., to maximize the classification performance and to minimize the number of selected features to overcome the curse of dimensionality. To balance this trade-off, feature selection can be handled as a multi-objective problem. In this paper, a feature selection approach is proposed based on a new multi-objective artificial bee colony (ABC) algorithm integrated with a non-dominated sorting procedure and genetic operators. Two different implementations of the proposed approach are developed: ABC with binary representation and ABC with continuous representation. Their performance is examined on 12 benchmark datasets, and the results are compared with those of linear forward selection, greedy stepwise backward selection, two single-objective ABC algorithms and three well-known multi-objective evolutionary computation algorithms. The results show that the proposed approach with the binary representation outperformed the other methods in terms of both dimensionality reduction and classification accuracy. (C) 2017 Elsevier Inc. All rights reserved.
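The abstract frames feature selection as a two-objective minimization problem (classification error and number of selected features) whose non-dominated solutions form the Pareto front. The sketch below is not the authors' implementation; it is a minimal plain-Python illustration of how a binary feature mask maps to those two objectives and how a Pareto front is extracted from a set of candidates. The `error_rate_fn` wrapper and the toy objective values are hypothetical.

```python
from typing import Callable, List, Sequence, Tuple

Objectives = Tuple[float, float]  # (classification error rate, number of selected features)

def evaluate(mask: Sequence[int],
             error_rate_fn: Callable[[Sequence[int]], float]) -> Objectives:
    """Map a binary feature mask to the two objectives, both to be minimized."""
    # error_rate_fn is a hypothetical wrapper around any classifier/validation scheme.
    return error_rate_fn(mask), float(sum(mask))

def dominates(a: Objectives, b: Objectives) -> bool:
    """Pareto dominance: a is no worse than b in every objective and strictly better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points: List[Objectives]) -> List[Objectives]:
    """Keep only the non-dominated objective vectors (the Pareto front)."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

if __name__ == "__main__":
    # A mask selecting 3 of 4 features, scored by a dummy error function:
    print(evaluate([1, 0, 1, 1], error_rate_fn=lambda m: 0.20))  # -> (0.2, 3.0)

    # Toy objective vectors (error rate, #features); values are made up for illustration.
    candidates = [(0.12, 8.0), (0.10, 12.0), (0.15, 5.0), (0.12, 10.0)]
    print(pareto_front(candidates))  # (0.12, 10.0) is dropped; it is dominated by (0.12, 8.0)
```

In a binary-representation ABC, each food source would correspond to such a mask; how the continuous representation is decoded into a feature subset is not specified in the abstract.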
Pages: 462-479
Page count: 18
Related papers
48 records in total
  • [21] A comprehensive survey: artificial bee colony (ABC) algorithm and applications
    Karaboga, Dervis
    Gorkemli, Beyza
    Ozturk, Celal
    Karaboga, Nurhan
    [J]. ARTIFICIAL INTELLIGENCE REVIEW, 2014, 42 (01) : 21 - 57
  • [22] Kira K, 1992, Proc. 9th Int. Workshop on Machine Learning, p. 249
  • [23] Wrappers for feature subset selection
    Kohavi, R
    John, GH
    [J]. ARTIFICIAL INTELLIGENCE, 1997, 97 (1-2) : 273 - 324
  • [24] Input feature selection for classification problems
    Kwak, N
    Choi, CH
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS, 2002, 13 (01) : 143 - 159
  • [25] Honey, I shrunk the sample covariance matrix - Problems in mean-variance optimization.
    Ledoit, O
    Wolf, M
    [J]. JOURNAL OF PORTFOLIO MANAGEMENT, 2004, 30 (04) : 110+
  • [26] Liagkouras K., 2013, Proc. 22nd Int. Conf. on Computer Communication and Networks, p. 1
  • [27] Lichman, M., 2013, UCI Machine Learning Repository
  • [28] Toward integrating feature selection algorithms for classification and clustering
    Liu, H
    Yu, L
    [J]. IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2005, 17 (04) : 491 - 502
  • [29] An Improved Particle Swarm Optimization for Feature Selection
    Liu, Yuanning
    Wang, Gang
    Chen, Huiling
    Dong, Hao
    Zhu, Xiaodong
    Wang, Sujing
    [J]. JOURNAL OF BIONIC ENGINEERING, 2011, 8 (02) : 191 - 200
  • [30] On the effectiveness of receptors in recognition systems
    Marill, T
    Green, DM
    [J]. IEEE TRANSACTIONS ON INFORMATION THEORY, 1963, 9 (01) : 11+