Single-objective and multi-objective mixed-variable grey wolf optimizer for joint feature selection and classifier parameter tuning
Cited by: 2
Authors:
Li, Hongjuan [1]
Kang, Hui [1,2]
Li, Jiahui [1]
Pang, Yanyun [1]
Sun, Geng [1,2]
Liang, Shuang [3]
Affiliations:
[1] Jilin Univ, Coll Comp Sci & Technol, Changchun 130012, Peoples R China
[2] Jilin Univ, Key Lab Symbol Computat & Knowledge Engn, Minist Educ, Changchun 130012, Peoples R China
[3] Northeast Normal Univ, Sch Informat Sci & Technol, Changchun 130117, Peoples R China
Funding:
National Natural Science Foundation of China;
Keywords:
Classification;
Feature selection;
Mixed-variable problem;
Single-objective optimization;
Multi-objective optimization;
PARTICLE SWARM OPTIMIZATION;
GENETIC ALGORITHM;
DOI:
10.1016/j.asoc.2024.112121
Chinese Library Classification (CLC):
TP18 [Artificial Intelligence Theory];
Discipline codes:
081104; 0812; 0835; 1405
Abstract:
Feature selection plays an essential role in data preprocessing: it extracts valuable information from extensive data, thereby enhancing the performance of machine learning classification. However, existing feature selection methods primarily focus on selecting feature subsets without considering the impact of classifier parameters on the optimal subset. Unlike these works, this paper jointly optimizes the feature subset and the classifier parameters to minimize the number of features and achieve a low classification error rate. Since feature selection is an optimization problem with a binary solution space while classifier parameters involve both continuous and discrete variables, the formulated problem becomes a complex multi-objective mixed-variable problem. To address this challenge, we consider a single-objective optimization method and a multi-objective optimization approach. Specifically, in the single-objective method, we adopt the linear weight method to convert the multiple objectives into a single fitness function and then propose a mixed-variable grey wolf optimizer (MGWO) to optimize it. The proposed MGWO introduces Chaos-Faure initialization, Log convergence factor adjustment, and optimal solution adaptive update operators to enhance its adaptability and to balance the global and local search of the algorithm. Subsequently, an improved multi-objective grey wolf optimizer (IMOGWO) is introduced to address the problem directly. The proposed IMOGWO introduces improved initialization, local search, and binary variable mutation operators to balance its exploration and exploitation abilities, making it more suitable for our mixed-variable problem. Extensive simulation results show that MGWO and IMOGWO outperform recent and classic baselines. Moreover, we find that jointly optimizing classifier parameters can significantly improve classification accuracy.
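The linear-weight scalarization mentioned in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: the k-NN classifier, its mixed parameters k (discrete) and p (continuous), the weight w, and the toy data are all hypothetical choices made for the example. A candidate solution couples a binary feature mask with the classifier parameters, and the fitness blends the error rate with the selected-feature ratio.

```python
# Hypothetical sketch: linear-weight fitness for joint feature selection
# and classifier parameter tuning (classifier, parameters, and data are
# illustrative assumptions, not taken from the paper).

def minkowski(a, b, p):
    """Minkowski distance of order p between two equal-length vectors."""
    return sum(abs(x - y) ** p for x, y in zip(a, b)) ** (1.0 / p)

def knn_error_rate(train, test, mask, k, p):
    """Error rate of a k-NN classifier using only features where mask == 1."""
    sel = [i for i, m in enumerate(mask) if m == 1]
    errors = 0
    for xq, yq in test:
        xq_s = [xq[i] for i in sel]
        dists = sorted(
            (minkowski([x[i] for i in sel], xq_s, p), y) for x, y in train
        )
        votes = [y for _, y in dists[:k]]
        pred = max(set(votes), key=votes.count)   # majority vote
        errors += pred != yq
    return errors / len(test)

def fitness(mask, k, p, train, test, w=0.9):
    """Linear-weight scalarization: w * error + (1 - w) * feature ratio."""
    if not any(mask):          # guard: at least one feature must be selected
        return 1.0
    err = knn_error_rate(train, test, mask, k, p)
    return w * err + (1 - w) * sum(mask) / len(mask)

# Toy data: two classes separable in the first two features; the third is noise.
train = [([0.0, 0.1, 9.0], 0), ([0.2, 0.0, 1.0], 0),
         ([1.0, 1.1, 5.0], 1), ([0.9, 1.0, 2.0], 1)]
test = [([0.1, 0.1, 7.0], 0), ([1.0, 0.9, 0.5], 1)]

full = fitness([1, 1, 1], k=1, p=2.0, train=train, test=test)
subset = fitness([1, 1, 0], k=1, p=2.0, train=train, test=test)
```

On this toy data, dropping the noisy third feature lowers both terms of the fitness, which is exactly the trade-off a mixed-variable optimizer such as MGWO would search over (mask bits as binary variables, k and p as discrete and continuous variables).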
Pages: 31