Particle swarm optimization based feature selection using factorial design

Cited by: 1
Authors
Kocak, Emre [1 ,2 ]
Orkcu, Haci Hasan [1 ]
Affiliations
[1] Gazi Univ, Fac Sci, Dept Stat, Ankara, Turkiye
[2] Gazi Univ, Grad Sch Nat & Appl Sci, Ankara, Turkiye
Source
HACETTEPE JOURNAL OF MATHEMATICS AND STATISTICS | 2024, Vol. 53, No. 3
Keywords
Feature selection; classification; particle swarm optimization; chaotic map; transfer function; factorial design; GENE SELECTION; MUTUAL INFORMATION; ALGORITHM; CLASSIFICATION; SCHEME;
DOI
10.15672/hujms.1346686
CLC number: O1 [Mathematics]
Subject classification codes: 0701; 070101
Abstract
Feature selection, a common and important problem in current scientific research, is both a crucial data preprocessing technique and a combinatorial optimization task. It aims to select a subset of informative and relevant features from the original feature set. Improving performance on a classification task therefore requires processing the original data with a feature selection strategy before the learning process. Particle swarm optimization, a metaheuristic algorithm that keeps computational complexity from growing excessively, can solve the feature selection problem quickly and with acceptable classification accuracy because it incorporates strategies for escaping local optima. The literature offers only separate, largely trial-and-error approaches for setting the critical binary particle swarm optimization parameters, namely the inertia weight, the transfer function, the threshold value, and the swarm size, even though these parameters directly affect the performance of the binary particle swarm optimization algorithm used for feature selection. Unlike these approaches, this paper evaluates all binary particle swarm optimization parameters together using a statistically based factorial design, yielding scientifically grounded findings. The results show the extent to which the threshold value and the transfer function statistically affect the performance of the binary particle swarm optimization algorithm.
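As a minimal sketch (not the authors' implementation), the snippet below illustrates the binary PSO ingredients the abstract refers to: velocities are updated with an inertia weight, an S-shaped transfer function maps each velocity to a selection probability, and a threshold decides whether a feature bit is set; a small full factorial grid over the four studied parameters is then enumerated. The function names, parameter values, factor levels, and the choice of a sigmoid transfer function are illustrative assumptions.

```python
import itertools
import numpy as np

def sigmoid_transfer(v):
    # S-shaped transfer function: maps a real-valued velocity to (0, 1).
    return 1.0 / (1.0 + np.exp(-v))

def bpso_step(x, v, pbest, gbest, w=0.7, c1=1.5, c2=1.5, threshold=0.5, rng=None):
    """One binary PSO iteration over a population of feature-mask particles.

    x, v, pbest : arrays of shape (n_particles, n_features)
    gbest       : array of shape (n_features,)
    """
    rng = rng or np.random.default_rng()
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    # Velocity update with inertia weight w and acceleration coefficients c1, c2.
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    # Transfer function plus threshold decide which features are selected (1) or dropped (0).
    x = (sigmoid_transfer(v) > threshold).astype(int)
    return x, v

# Illustrative two-level full factorial design over the four parameters discussed in the abstract.
design = list(itertools.product(
    [0.4, 0.9],                 # inertia weight
    ["S-shaped", "V-shaped"],   # transfer function family
    [0.5, 0.6],                 # threshold value
    [20, 40],                   # swarm size
))
```

Each row of `design` would correspond to one experimental run; the selected feature subsets could then be scored, for example by classification accuracy, and the factor effects analyzed statistically as in a factorial design.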
Pages: 879-896
Number of pages: 18