A joint multiobjective optimization of feature selection and classifier design for high-dimensional data classification

Cited by: 16
Authors
Bai, Lixia [1 ]
Li, Hong [1 ]
Gao, Weifeng [1 ]
Xie, Jin [1 ]
Wang, Houqiang [1 ]
Affiliations
[1] Xidian Univ, Sch Math & Stat, Xian 710126, Shaanxi, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Feature selection; Classifier design; Ensemble learning; Multiobjective optimization; High-dimensional data; BINARY DIFFERENTIAL EVOLUTION; GENETIC ALGORITHM; NEURAL-NETWORKS;
DOI
10.1016/j.ins.2023.01.069
CLC number
TP [Automation Technology, Computer Technology];
Subject classification code
0812;
Abstract
Feature selection (FS) has attracted extensive attention in data mining and machine learning. The purpose of FS in a classification task is to find an optimal subset of the given candidate features. Recently, more and more meta-heuristic algorithms have been applied to FS problems. However, meta-heuristic algorithms suffer from certain issues, such as a large solution search space and high computational cost. Moreover, most existing meta-heuristic algorithms focus only on selecting an optimal feature subset and pay little attention to the optimal design of the classifier. In this article, we propose a joint multiobjective optimization method for both feature selection and classifier design, called JMO-FSCD. The proposed approach uses a neural network as the classifier and introduces a non-iterative algorithm for training it, ensuring good performance and fast learning. A new coding scheme is also designed to optimize the feature subset and the classifier simultaneously. To demonstrate the superiority of the proposed approach, its performance is compared with that of six state-of-the-art FS algorithms. Experimental results on thirty-five benchmark data sets confirm the superior performance of the proposed JMO-FSCD.
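The abstract does not spell out the coding scheme or the non-iterative trainer, so the following is a minimal illustrative sketch rather than the authors' implementation. It assumes each candidate solution packs a binary feature mask together with one classifier hyperparameter (the hidden-layer size of an ELM-style network whose output weights are solved in closed form by least squares, a plausible stand-in for the paper's non-iterative training step), and it evaluates the two objectives such a method would typically minimize: classification error and the fraction of selected features. The dataset, the hidden-size range, and the exact objective definitions are illustrative assumptions.

# Minimal sketch (not the authors' implementation) of a joint encoding for
# feature selection and classifier design, with the two objectives a
# multiobjective FS method would typically minimize.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)          # placeholder data set
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
scaler = StandardScaler().fit(X_tr)
X_tr, X_te = scaler.transform(X_tr), scaler.transform(X_te)
n_features = X.shape[1]

def decode(code):
    """Split one candidate solution into a feature mask and a hidden-layer size."""
    mask = code[:n_features] > 0.5                   # feature-selection part
    hidden = int(10 + code[n_features] * 190)        # classifier-design part (10..200 nodes, assumed range)
    return mask, hidden

def elm_fit_predict(X_tr, y_tr, X_te, hidden):
    """Non-iterative training: random hidden weights, output weights via pseudo-inverse."""
    W = rng.standard_normal((X_tr.shape[1], hidden))
    b = rng.standard_normal(hidden)
    H_tr = np.tanh(X_tr @ W + b)
    beta = np.linalg.pinv(H_tr) @ y_tr               # closed-form least-squares solution
    H_te = np.tanh(X_te @ W + b)
    return (H_te @ beta > 0.5).astype(int)

def objectives(code):
    """Two objectives to minimize: classification error and fraction of selected features."""
    mask, hidden = decode(code)
    if not mask.any():
        return 1.0, 1.0                              # penalize empty feature subsets
    pred = elm_fit_predict(X_tr[:, mask], y_tr, X_te[:, mask], hidden)
    error = float(np.mean(pred != y_te))
    return error, mask.sum() / n_features

# Evaluate one random candidate; a real multiobjective optimizer (e.g. an
# NSGA-II-style algorithm) would evolve a population of such codes.
candidate = rng.random(n_features + 1)
print(objectives(candidate))

Appending the classifier parameter to the feature mask is one simple way to let a single population search both spaces at once; the paper's actual coding scheme may differ.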
Pages: 457-473
Page count: 17