Subspace learning for feature selection via rank revealing QR factorization: Fast feature selection

Cited by: 3
Authors
Moslemi, Amir [1 ]
Ahmadian, Arash [2 ]
Affiliations
[1] Seneca Polytech, Sch Software Design & Data Sci, Toronto, ON, Canada
[2] Univ Toronto, Edward S Rogers Sr Dept Elect & Comp Engn, Toronto, ON M5S 1A1, Canada
Keywords
Feature selection; Rank revealing QR factorization; Non-negative matrix factorization; Genetic algorithm and hybrid feature selection; UNSUPERVISED FEATURE-SELECTION; SUPERVISED FEATURE-SELECTION; MATRIX FACTORIZATION; MUTUAL INFORMATION; CLASSIFICATION; OPTIMIZATION; ALGORITHMS; APPROXIMATION; REDUCTION; PATTERNS;
DOI
10.1016/j.eswa.2024.124919
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The identification of informative and distinguishing features in high-dimensional data has gained significant attention in machine learning. Recently, there has been growing interest in employing matrix factorization-based techniques, such as non-negative matrix factorization, for feature selection. The primary objective of feature selection via matrix factorization is to extract a lower-dimensional subspace that captures the essence of the original space. This study introduces a novel unsupervised feature selection technique that leverages rank revealing QR (RRQR) factorization. Compared to singular value decomposition (SVD) and non-negative matrix factorization (NMF), RRQR is more computationally efficient. The uniqueness of this technique lies in the use of the permutation matrix of QR for feature selection. Additionally, we integrate QR factorization into the objective function of NMF to create a new unsupervised feature selection method. Furthermore, we propose a hybrid feature selection algorithm combining RRQR and a genetic algorithm: it eliminates redundant features using RRQR factorization and selects the most distinguishing subset of features using the genetic algorithm. Experimental comparisons with state-of-the-art feature selection algorithms in supervised, unsupervised, and semi-supervised settings demonstrate the reliability and robustness of the proposed algorithm. The evaluation is conducted on eight microarray datasets using KNN, SVM, and C4.5 classifiers. The experimental results indicate that the proposed method achieves performance comparable to state-of-the-art feature selection methods, while exhibiting a significantly lower computational cost than the other techniques.
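The core idea the abstract describes, selecting features from the permutation produced by a rank-revealing QR, can be sketched with a greedy column-pivoted Gram-Schmidt procedure: at each step, pick the feature (column) with the largest residual norm, then deflate the remaining columns against it. This is a minimal illustrative sketch of the pivoting mechanism only; the paper's exact RRQR variant, NMF-coupled objective, and genetic-algorithm stage are not reproduced here, and the function name `rrqr_select` is hypothetical.

```python
import numpy as np

def rrqr_select(X, k):
    """Return indices of k columns of X chosen by greedy column-pivoted QR.

    Illustrative sketch only (hypothetical helper, not the paper's code):
    the pivot order of a column-pivoted QR ranks features by how much new
    (non-redundant) variation each column adds.
    """
    R = np.asarray(X, dtype=float).copy()
    pivots = []
    for _ in range(k):
        norms = np.linalg.norm(R, axis=0)
        norms[pivots] = -1.0            # never re-select a chosen column
        j = int(np.argmax(norms))
        if norms[j] < 1e-12:            # remaining columns are (numerically) dependent
            break
        pivots.append(j)
        q = R[:, j] / norms[j]          # unit vector for the chosen column
        R = R - np.outer(q, q @ R)      # deflate: remove its span from all columns
    return pivots

# Toy usage: select 3 of 10 random features.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 10))
print(rrqr_select(X, 3))
```

In practice a Householder-based pivoted QR (e.g. `scipy.linalg.qr(X, pivoting=True)`) would be used for numerical stability; the greedy loop above is the readable equivalent of its pivot selection.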
Pages: 18
Related Papers
50 records total
  • [1] Subspace learning for unsupervised feature selection via matrix factorization
    Wang, Shiping
    Pedrycz, Witold
    Zhu, Qingxin
    Zhu, William
    PATTERN RECOGNITION, 2015, 48 (01) : 10 - 19
  • [2] Dual-dual subspace learning with low-rank consideration for feature selection
    Moslemi, Amir
    Bidar, Mahdi
    PHYSICA A-STATISTICAL MECHANICS AND ITS APPLICATIONS, 2024, 651
  • [3] Feature selection by combining subspace learning with sparse representation
    Cheng, Debo
    Zhang, Shichao
    Liu, Xingyi
    Sun, Ke
    Zong, Ming
    MULTIMEDIA SYSTEMS, 2017, 23 (03) : 285 - 291
  • [4] Subspace learning for unsupervised feature selection via adaptive structure learning and rank approximation
    Shang, Ronghua
    Xu, Kaiming
    Jiao, Licheng
    NEUROCOMPUTING, 2020, 413: 72 - 84
  • [5] Subspace Learning and Feature Selection via Orthogonal Mapping
    Mandanas, Fotios D.
    Kotropoulos, Constantine L.
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2020, 68 : 1034 - 1047
  • [6] Dual regularized subspace learning using adaptive graph learning and rank constraint: Unsupervised feature selection on gene expression microarray datasets
    Moslemi, Amir
    Ahmadian, Arash
    COMPUTERS IN BIOLOGY AND MEDICINE, 2023, 167
  • [7] Unsupervised feature selection by combining subspace learning with feature self-representation
    Li, Yangding
    Lei, Cong
    Fang, Yue
    Hu, Rongyao
    Li, Yonggang
    Zhang, Shichao
    PATTERN RECOGNITION LETTERS, 2018, 109 : 35 - 43
  • [8] Large Margin Subspace Learning for feature selection
    Liu, Bo
    Fang, Bin
    Liu, Xinwang
    Chen, Jie
    Huang, Zhenghong
    He, Xiping
    PATTERN RECOGNITION, 2013, 46 (10) : 2798 - 2806
  • [9] Transfer Learning via Feature Selection Based Nonnegative Matrix Factorization
    Balasubramaniam, Thirunavukarasu
    Nayak, Richi
    Yuen, Chau
    WEB INFORMATION SYSTEMS ENGINEERING - WISE 2019, 2019, 11881 : 82 - 97
  • [10] A tutorial-based survey on feature selection: Recent advancements on feature selection
    Moslemi, Amir
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2023, 126