Deep Feature Selection using an Enhanced Sparse Group Lasso Algorithm

Cited by: 0
Authors
Farokhmanesh, Fatemeh [1]
Sadeghi, Mohammad Taghi [1]
Affiliations
[1] Yazd Univ, Dept Elect Engn, Yazd, Iran
Source
2019 27TH IRANIAN CONFERENCE ON ELECTRICAL ENGINEERING (ICEE 2019) | 2019
Keywords
feature selection; lasso; sparse representation; deep learning; REGRESSION;
DOI
10.1109/iraniancee.2019.8786386
CLC Classification
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Subject Classification Codes
0808; 0809
Abstract
Feature selection is an important data dimensionality reduction method widely used in machine learning. In this framework, sparse representation based feature selection methods are very attractive because, by their nature, they try to represent the data with as few non-zero coefficients as possible. Deep neural networks usually produce a very high dimensional feature space, a situation in which one can take advantage of feature selection approaches. In this paper, three sparse feature selection methods are first compared. The Sparse Group Lasso (SGL) algorithm is one of the adopted approaches. This method is theoretically well founded and leads to good results for hand-crafted features. Its most important property is that it strongly induces sparsity in the data. A main step of the SGL method is grouping the features. In this paper, a k-means clustering based method is applied to group the features. Our experimental results show that this sparse representation based method leads to very successful results in deep neural networks.
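The abstract describes forming feature groups by k-means clustering and then applying the Sparse Group Lasso. The following is a minimal illustrative sketch of that pipeline, not the authors' implementation: it solves a linear model with the standard SGL penalty (an L1 term plus a per-group L2 norm term) via proximal gradient descent, and obtains the groups by k-means clustering of the feature columns. The function name, penalty weights `lam1`/`lam2`, and the synthetic data are all assumptions made for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans

def sparse_group_lasso(X, y, groups, lam1=0.1, lam2=0.1, n_iter=500):
    """Proximal gradient (ISTA) for
    (1/2n)||y - Xw||^2 + lam1*||w||_1 + lam2*sum_g sqrt(p_g)*||w_g||_2."""
    n, p = X.shape
    w = np.zeros(p)
    # Lipschitz constant of the smooth part's gradient: sigma_max(X)^2 / n
    step = n / np.linalg.norm(X, 2) ** 2
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y) / n
        z = w - step * grad
        # elementwise soft-threshold (the lasso part of the penalty)
        z = np.sign(z) * np.maximum(np.abs(z) - step * lam1, 0.0)
        # blockwise soft-threshold (the group lasso part of the penalty)
        for g in np.unique(groups):
            idx = groups == g
            norm = np.linalg.norm(z[idx])
            scale = step * lam2 * np.sqrt(idx.sum())
            z[idx] = 0.0 if norm <= scale else z[idx] * (1.0 - scale / norm)
        w = z
    return w

# Synthetic example: 30 features, only the first 5 informative.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 30))
true_w = np.zeros(30)
true_w[:5] = 2.0
y = X @ true_w + 0.1 * rng.standard_normal(200)

# Group the features by k-means clustering of the feature columns (X.T).
groups = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X.T)
w = sparse_group_lasso(X, y, groups)
selected = np.flatnonzero(np.abs(w) > 1e-6)
```

In a deep feature selection setting, `X` would instead hold activations extracted from a network layer; the grouping step matters because SGL zeroes out entire clusters of correlated features at once while the L1 term keeps individual coefficients sparse within the surviving groups.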
Pages: 1549-1552
Number of pages: 4