Multi-layer manifold learning with feature selection

Cited by: 5
Authors
Dornaika, F. [1,2]
Affiliations
[1] Univ Basque Country, UPV EHU, San Sebastian, Spain
[2] Basque Fdn Sci, Ikerbasque, Bilbao, Spain
Funding
European Research Council;
Keywords
Data embedding; Feature selection; Feature extraction; Manifold learning; Face recognition; Pattern classification; LOCALITY PRESERVING PROJECTIONS; FEATURE-EXTRACTION; DIMENSIONALITY REDUCTION; FRAMEWORK; PLUS;
DOI
10.1007/s10489-019-01563-9
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
Many fundamental problems in machine learning require some form of dimensionality reduction. To this end, two different strategies have been used: manifold learning and feature selection. Manifold learning (or data embedding) attempts to compute a subspace from the original data through feature recombination/transformation. Feature selection aims to select the most relevant features in the original space. In this paper, we propose a novel cooperative manifold learning-feature selection scheme that goes beyond the simple concatenation of these two modules. Our basic idea is to transform a given shallow embedding into a deep variant by computing a cascade of embeddings, in which each embedding undergoes feature selection and elimination. We use filter approaches to efficiently identify and discard irrelevant features at each stage of the process. As a case study, the proposed framework was applied with two typical linear embedding algorithms, Local Discriminant Embedding (LDE) (a supervised technique) and Locality Preserving Projections (LPP) (an unsupervised technique), on four challenging face databases, and it was compared with other cooperative schemes. Moreover, a comparison with several state-of-the-art manifold learning methods is provided. As our experimental study shows, the proposed framework can achieve superior learning performance compared with classic cooperative schemes and with many competing manifold learning methods.
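
For illustration only, the following is a minimal sketch of the cascaded "embed, then filter-select" idea described in the abstract. It is not the author's implementation: PCA stands in for the LPP/LDE embedding step (neither is available in scikit-learn), the mutual-information filter score and the keep_ratio parameter are illustrative assumptions, and the function name multilayer_embedding is hypothetical. The loop simply alternates a linear embedding with filter-based removal of the lowest-scoring features, then feeds the reduced representation to the next layer.

```python
# Sketch of a multi-layer embedding cascade with filter-based feature selection.
# Assumptions: PCA replaces LPP/LDE, and mutual information is the filter score;
# the paper's actual embedding and selection criteria may differ.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectKBest, mutual_info_classif


def multilayer_embedding(X, y, n_layers=3, dim=30, keep_ratio=0.8):
    """Cascade of embeddings; after each embedding, a filter method
    discards the lowest-scoring features before the next layer."""
    Z = X
    models = []
    for layer in range(n_layers):
        # 1) Linear embedding of the current representation (stand-in for LPP/LDE).
        emb = PCA(n_components=min(dim, Z.shape[1]))
        Z = emb.fit_transform(Z)

        # 2) Filter-based selection: keep only the top-scoring embedded features.
        k = max(1, int(keep_ratio * Z.shape[1]))
        sel = SelectKBest(score_func=mutual_info_classif, k=k)
        Z = sel.fit_transform(Z, y)

        models.append((emb, sel))
    return Z, models


# Toy usage on random data (500 samples, 200 features, 5 classes).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 200))
y = rng.integers(0, 5, size=500)
Z, models = multilayer_embedding(X, y)
print(Z.shape)
```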
Pages: 1859 - 1871
Page count: 13