Dual-dual subspace learning with low-rank consideration for feature selection

Cited by: 2
Authors
Moslemi, Amir [1 ]
Bidar, Mahdi [2 ]
Affiliations
[1] Seneca Polytech, Sch Software Design & Data Sci, Toronto, ON M2J 2X5, Canada
[2] Univ Regina, Dept Comp Sci, Regina, SK, Canada
Keywords
Nonnegative matrix factorization; Unsupervised feature selection; Regularization; Low-rank; UNSUPERVISED FEATURE-SELECTION; SUPERVISED FEATURE-SELECTION; MATRIX FACTORIZATION; APPROXIMATION;
DOI
10.1016/j.physa.2024.129997
Chinese Library Classification
O4 [Physics];
Discipline code
0702 ;
Abstract
The performance of machine learning algorithms can be degraded by redundant features in high-dimensional data, and these irrelevant features also increase the computation time of learning models. Such problems can be addressed with techniques including feature selection and dimensionality reduction. Unsupervised feature selection has drawn increasing attention because collecting labels for supervised feature selection is difficult. To this end, we developed an innovative approach based on nonnegative matrix factorization (NMF) to remove redundant information. In this technique, for the first time, local-information-preserving and global-information-preserving regularizations are applied to both the feature weight matrix and the representation matrix, which is why we call it Dual-Dual regularized feature selection. Furthermore, the Schatten p-norm is utilized to capture the inherent low-rank properties of the data. To demonstrate the effectiveness of the proposed method, experimental studies are conducted on six benchmark datasets. The computational results show that the proposed method is more efficient for feature selection than state-of-the-art unsupervised feature selection techniques.
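The factorization-and-scoring idea underlying NMF-based unsupervised feature selection can be sketched as follows. This is a minimal illustration under our own assumptions, not the paper's method: it uses plain multiplicative-update NMF and omits the dual graph regularizers and the Schatten p-norm term; all function names are ours.

```python
import numpy as np

def nmf_feature_scores(X, k=5, n_iter=200, seed=0, eps=1e-10):
    """Factor X (features x samples) as X ~ W @ H with nonnegative
    factors via multiplicative updates, then score each feature by
    the l2 norm of its row in the feature weight matrix W.
    NOTE: a bare-bones sketch; the paper additionally regularizes
    both W and H and adds a Schatten p-norm low-rank term."""
    rng = np.random.default_rng(seed)
    n_features, n_samples = X.shape
    W = rng.random((n_features, k))
    H = rng.random((k, n_samples))
    for _ in range(n_iter):
        # Standard Lee-Seung multiplicative updates (eps avoids division by 0)
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    # A feature whose W-row has large norm contributes strongly to the factors
    return np.linalg.norm(W, axis=1)

def select_features(X, n_select, **kw):
    """Return the indices of the n_select highest-scoring features."""
    scores = nmf_feature_scores(X, **kw)
    return np.argsort(scores)[::-1][:n_select]
```

Ranking features by row norms of the learned weight matrix is the common NMF-based selection criterion; the paper's contribution lies in how W and H are regularized, not in this scoring step.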
Pages: 17