Binary feature mask optimization for feature selection

Cited by: 0
Authors
Lorasdagi, Mehmet E. [1 ]
Turali, Mehmet Y. [2 ]
Kozat, Suleyman S. [1 ]
Affiliations
[1] Department of Electrical and Electronics Engineering, Bilkent University, Ankara
[2] Department of Electrical and Computer Engineering, University of California Los Angeles, Los Angeles
Keywords
Dimensionality reduction; Feature selection; Machine learning; Wrapper methods;
DOI
10.1007/s00521-024-10913-9
Abstract
We investigate the feature selection problem for generic machine learning models. We introduce a novel framework that selects features based on the outcomes of the model. Our framework uses a feature masking approach to eliminate features during the selection process, instead of completely removing them from the dataset. This allows us to use the same machine learning model throughout feature selection, unlike other feature selection methods that must retrain the model on each iteration because the dataset changes dimension. We obtain the mask operator using the predictions of the machine learning model, which offers a comprehensive view of the subsets of features essential for the predictive performance of the model. A variety of approaches exist in the feature selection literature; however, to our knowledge, no study has introduced a training-free framework for a generic machine learning model that selects features by considering the importance of feature subsets as a whole rather than focusing on individual features. We demonstrate significant performance improvements on real-life datasets under different settings using LightGBM and a multilayer perceptron as our machine learning models. Our results show that our methods outperform traditional feature selection techniques. Specifically, in experiments on the residential building dataset, our general binary mask optimization algorithm reduced the mean squared error by up to 49% compared with conventional methods, achieving a mean squared error of 0.0044. The high performance of the algorithm stems from its feature masking approach and its flexibility in the number of selected features: features are selected based on the validation performance of the machine learning model, so the number of selected features is not predetermined and adjusts dynamically to the dataset.
Additionally, we openly share our implementation to encourage further research in this area. © The Author(s), under exclusive licence to Springer-Verlag London Ltd., part of Springer Nature 2024.
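The masking idea described in the abstract can be sketched as follows. This is a hypothetical illustration, not the authors' released code: it trains a simple least-squares model once, then greedily flips bits of a binary mask, evaluating each candidate mask on a validation set by zeroing out the masked features at prediction time. The zero-masking choice, the greedy bit-flip search, and all names here are assumptions for the sake of the sketch.

```python
import numpy as np

# Synthetic regression data: only features 0, 1, and 4 are informative.
rng = np.random.default_rng(0)
n, d = 200, 8
X = rng.normal(size=(n, d))
w_true = np.array([2.0, -1.5, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0])
y = X @ w_true + 0.1 * rng.normal(size=n)

X_tr, y_tr = X[:150], y[:150]
X_val, y_val = X[150:], y[150:]

# Train the model ONCE on all features (here: ordinary least squares).
w, *_ = np.linalg.lstsq(X_tr, y_tr, rcond=None)

def val_mse(mask):
    # Apply the binary mask by zeroing masked-out features; the trained
    # model itself is untouched, so no retraining is needed.
    return float(np.mean(((X_val * mask) @ w - y_val) ** 2))

mask = np.ones(d)
best = val_mse(mask)
improved = True
while improved:  # greedy bit-flip search over the mask
    improved = False
    for j in range(d):
        trial = mask.copy()
        trial[j] = 1.0 - trial[j]
        score = val_mse(trial)
        if score < best:
            mask, best, improved = trial, score, True

selected = np.flatnonzero(mask)
print("selected features:", selected.tolist())
```

Note that the number of selected features is not fixed in advance: the search keeps whatever subset minimizes validation error, mirroring the dynamic feature-count behavior the abstract attributes to the general binary mask optimization algorithm.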
Pages: 5155-5167
Page count: 12
Related papers (21 total)
[1]  
Bishop C.M., Pattern Recognition and Machine Learning, (2006)
[2]  
Hashemi A., Pajoohan M.-R., Dowlatshahi M.B., Nsofs: a non-dominated sorting-based online feature selection algorithm, Neural Comput Appl, 36, pp. 1181-1197, (2023)
[3]  
Ghosh T., Kirby M., Nonlinear feature selection using sparsity-promoted centroid-encoder, Neural Comput Appl, 35, pp. 21883-21902, (2023)
[4]  
Karlupia N., Abrol P., Wrapper-based optimized feature selection using nature-inspired algorithms, Neural Comput Appl, 35, pp. 12675-12689, (2023)
[5]  
Batur Sahin C., Abualigah L., A novel deep learning-based feature selection model for improving the static analysis of vulnerability detection, Neural Comput Appl, 33, pp. 14049-14067, (2021)
[6]  
Shi Y., Miao J., Niu L., Feature selection with MCP 2 regularization, Neural Comput Appl, 31, pp. 6699-6709, (2019)
[7]  
Pudjihartono N., Fadason T., Kempa-Liehr A.W., O'Sullivan J.M., A review of feature selection methods for machine learning-based disease risk prediction, Front Bioinform, 2, (2022)
[8]  
Guyon I., Elisseeff A., An introduction to variable and feature selection, J Mach Learn Res, 3, pp. 1157-1182, (2003)
[9]  
Vinh L.T., Lee S., Park Y.-T., d'Auriol B.J., A novel feature selection method based on normalized mutual information, Appl Intell, 37, pp. 100-120, (2012)
[10]  
Cover T., Thomas J., Elements of Information Theory, (2006)