Single-objective and multi-objective mixed-variable grey wolf optimizer for joint feature selection and classifier parameter tuning

Times Cited: 2
Authors
Li, Hongjuan [1 ]
Kang, Hui [1 ,2 ]
Li, Jiahui [1 ]
Pang, Yanyun [1 ]
Sun, Geng [1 ,2 ]
Liang, Shuang [3 ]
Affiliations
[1] Jilin Univ, Coll Comp Sci & Technol, Changchun 130012, Peoples R China
[2] Jilin Univ, Key Lab Symbol Computat & Knowledge Engn, Minist Educ, Changchun 130012, Peoples R China
[3] Northeast Normal Univ, Sch Informat Sci & Technol, Changchun 130117, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Classification; Feature selection; Mixed-variable problem; Single-objective optimization; Multi-objective optimization; PARTICLE SWARM OPTIMIZATION; GENETIC ALGORITHM;
DOI
10.1016/j.asoc.2024.112121
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
Feature selection plays an essential role in data preprocessing: it extracts valuable information from large datasets and thereby improves the performance of machine learning classifiers. However, existing feature selection methods focus primarily on selecting feature subsets without considering the impact of classifier parameters on the optimal subset. In contrast, this paper jointly optimizes the feature subset and the classifier parameters to minimize the number of selected features while achieving a low classification error rate. Since feature selection has a binary solution space while classifier parameters involve both continuous and discrete variables, the formulated problem is a complex multi-objective mixed-variable problem. To address this challenge, we propose both a single-objective and a multi-objective optimization method. In the single-objective method, we adopt the linear weight method to convert the multiple objectives into a single fitness function and then propose a mixed-variable grey wolf optimizer (MGWO) to optimize it. MGWO introduces Chaos-Faure initialization, a logarithmic convergence factor adjustment, and an adaptive update operator for the optimal solutions to enhance its adaptability and to balance global and local search. Subsequently, an improved multi-objective grey wolf optimizer (IMOGWO) is introduced to address the problem directly. IMOGWO incorporates improved initialization, local search, and binary-variable mutation operators to balance exploration and exploitation, making it better suited to the mixed-variable problem. Extensive simulation results show that MGWO and IMOGWO outperform recent and classic baselines. Moreover, we find that jointly optimizing classifier parameters significantly improves classification accuracy.
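To make the single-objective formulation concrete, the following is a minimal sketch of a linear-weight fitness for joint feature selection and classifier parameter tuning. It is written under stated assumptions, not as the paper's exact implementation: the candidate solution is a binary feature mask plus one discrete classifier parameter (a KNN neighbour count, chosen purely for illustration), and the trade-off weight `ALPHA` is hypothetical.

```python
# Sketch of a linear-weight fitness for joint feature selection and classifier
# parameter tuning. ALPHA and the KNN classifier are illustrative assumptions;
# the paper's single-objective MGWO would minimize a function of this form.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_breast_cancer(return_X_y=True)
ALPHA = 0.9  # assumed weight between error rate and feature-subset size

def fitness(mask: np.ndarray, k: int) -> float:
    """ALPHA * error_rate + (1 - ALPHA) * (selected features / total features)."""
    if mask.sum() == 0:              # an empty subset is infeasible
        return 1.0
    acc = cross_val_score(KNeighborsClassifier(n_neighbors=k),
                          X[:, mask.astype(bool)], y, cv=5).mean()
    error_rate = 1.0 - acc
    feature_ratio = mask.sum() / mask.size
    return ALPHA * error_rate + (1 - ALPHA) * feature_ratio

# Evaluate one mixed-variable candidate: a binary mask plus a discrete k.
rng = np.random.default_rng(0)
mask = rng.integers(0, 2, X.shape[1])
print(fitness(mask, k=5))
```

A wrapper-style metaheuristic such as MGWO would call this fitness repeatedly while updating candidate solutions over the mixed binary/discrete/continuous variables; the multi-objective IMOGWO would instead keep the error rate and the feature ratio as separate objectives.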
Pages: 31