Dual-dual subspace learning with low-rank consideration for feature selection

Cited: 2
Authors
Moslemi, Amir [1 ]
Bidar, Mahdi [2 ]
Affiliations
[1] Seneca Polytechnic, School of Software Design & Data Science, Toronto, ON M2J 2X5, Canada
[2] University of Regina, Department of Computer Science, Regina, SK, Canada
Keywords
Nonnegative matrix factorization; Unsupervised feature selection; Regularization; Low-rank; Supervised feature selection; Matrix factorization; Approximation
DOI
10.1016/j.physa.2024.129997
Chinese Library Classification
O4 [Physics]
Discipline Code
0702
Abstract
The performance of machine learning algorithms can be degraded by the redundant features of high-dimensional data, and these irrelevant features also increase the computation time of the learning model. These problems can be addressed through techniques such as feature selection and dimensionality reduction. Unsupervised feature selection has drawn increasing attention because collecting labels for supervised feature selection is difficult. To this end, we developed an innovative approach based on nonnegative matrix factorization (NMF) to remove redundant information. In this technique, for the first time, local information preserving regularization and global information preserving regularization are applied to both the feature weight matrix and the representation matrix, which is why we call the method dual-dual regularized feature selection. Furthermore, the Schatten p-norm is utilized to capture the inherent low-rank structure of the data. To demonstrate the effectiveness of the proposed method, experiments are conducted on six benchmark datasets. The computational results show that the proposed method selects features more effectively than state-of-the-art unsupervised feature selection techniques.
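This record does not include the paper's equations, but a minimal sketch of the kind of objective the abstract describes might look as follows. All notation here is assumed for illustration and may differ from the authors' formulation: X is the data matrix (d features, n samples) factorized as X ≈ WH, W is the feature weight matrix, H is the representation matrix, L_W and L_H are graph Laplacians encoding local feature and sample structure, and α, β, γ, λ are trade-off weights.

\[
\begin{aligned}
\min_{W \ge 0,\, H \ge 0}\quad
& \lVert X - WH \rVert_F^2 && \text{(NMF reconstruction)} \\
& + \alpha \operatorname{Tr}\!\left(W^{\top} L_W W\right)
  + \beta \operatorname{Tr}\!\left(H L_H H^{\top}\right) && \text{(local structure preservation on both factors)} \\
& + \gamma \left( \lVert W \rVert_F^2 + \lVert H \rVert_F^2 \right) && \text{(global regularization on both factors)} \\
& + \lambda \lVert WH \rVert_{S_p}^{p} && \text{(Schatten $p$-norm low-rank term)}
\end{aligned}
\]

In such a scheme, the two trace terms and the two global terms regularize both factors locally and globally (hence "dual-dual"), the Schatten p-norm term encourages a low-rank reconstruction, and features would typically be ranked by the row norms of W; again, this is a sketch consistent with the abstract, not the paper's exact objective.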
Pages: 17