A generalized least-squares approach regularized with graph embedding for dimensionality reduction

Cited by: 44
Authors
Shen, Xiang-Jun [1 ]
Liu, Si-Xing [1 ]
Bao, Bing-Kun [2 ]
Pan, Chun-Hong [3 ]
Zha, Zheng-Jun [4 ]
Fan, Jianping [5 ]
Affiliations
[1] Jiangsu Univ, Sch Comp Sci & Commun Engn, Zhenjiang 212013, Jiangsu, Peoples R China
[2] Nanjing Univ Posts & Telecommun, Nanjing, Jiangsu, Peoples R China
[3] Chinese Acad Sci, Inst Automat, Natl Lab Pattern Recognit, Beijing 100190, Peoples R China
[4] Univ Sci & Technol China, Sch Informat Sci & Technol, Hefei, Anhui, Peoples R China
[5] Univ N Carolina, Dept Comp Sci, Charlotte, NC 28223 USA
Funding
National Natural Science Foundation of China;
Keywords
Dimensionality reduction; Graph embedding; Subspace learning; Least-squares; PRESERVING PROJECTIONS; EIGENMAPS; FRAMEWORK;
DOI
10.1016/j.patcog.2019.107023
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In current graph embedding methods, low-dimensional projections are obtained by preserving either the global or the local geometrical structure of the data. In this paper, the PCA (Principal Component Analysis) idea of minimizing least-squares reconstruction errors is regularized with graph embedding, unifying various local manifold embedding methods within a generalized framework that preserves both global and local low-dimensional subspace structure. Unlike the well-known PCA method, our proposed generalized least-squares approach considers the data distribution together with an instance penalty on each data point. In this way, PCA becomes a special instance of our generalized least-squares framework that preserves global projections. By applying graph embedding as a regularization, we obtain projections that preserve both the intrinsic local geometrical structure and the global structure of the data. Experimental results on a variety of face and handwritten-digit recognition tasks show that our proposed method yields lower-dimensional subspaces and higher classification accuracy than state-of-the-art graph embedding methods. (C) 2019 Elsevier Ltd. All rights reserved.
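The abstract describes combining a PCA-style least-squares reconstruction objective with a graph-embedding regularizer. As a hedged illustration only (the function name, the k-nearest-neighbor graph construction, the binary adjacency, and the trade-off parameter `lam` are assumptions, not the paper's exact formulation), one common way to realize this idea is to maximize the PCA variance term while penalizing the graph-Laplacian smoothness term, which reduces to an eigendecomposition:

```python
import numpy as np

def graph_regularized_pca(X, n_components=2, n_neighbors=5, lam=0.1):
    """Sketch of a PCA objective regularized by a kNN graph Laplacian.

    Maximizes tr(P^T (Xc^T Xc - lam * Xc^T L Xc) P) over orthonormal P,
    where L is the Laplacian of a symmetric k-nearest-neighbor graph.
    """
    n, d = X.shape
    Xc = X - X.mean(axis=0)                     # center the data

    # Build a symmetric binary kNN adjacency matrix from pairwise distances.
    D2 = ((Xc[:, None, :] - Xc[None, :, :]) ** 2).sum(axis=-1)
    idx = np.argsort(D2, axis=1)[:, 1:n_neighbors + 1]  # skip self (col 0)
    W = np.zeros((n, n))
    rows = np.repeat(np.arange(n), n_neighbors)
    W[rows, idx.ravel()] = 1.0
    W = np.maximum(W, W.T)                      # symmetrize
    L = np.diag(W.sum(axis=1)) - W              # unnormalized graph Laplacian

    # Global (PCA) term minus lam times the local (graph) penalty.
    M = Xc.T @ Xc - lam * (Xc.T @ L @ Xc)       # symmetric d x d matrix
    vals, vecs = np.linalg.eigh(M)
    P = vecs[:, np.argsort(vals)[::-1][:n_components]]  # top eigenvectors
    return Xc @ P, P                            # embedding and projection
```

With `lam = 0` this reduces to ordinary PCA, which mirrors the abstract's claim that PCA is a special instance of the regularized framework; increasing `lam` trades global variance preservation for local neighborhood smoothness.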
Pages: 10
Related Papers
48 items in total
  • [11] Hessian eigenmaps: Locally linear embedding techniques for high-dimensional data
    Donoho, DL
    Grimes, C
    [J]. PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA, 2003, 100 (10) : 5591 - 5596
  • [12] Dutta A., 2019, Proceedings of the 33rd AAAI Conference on Artificial Intelligence
  • [13] Kinematic errors prediction for multi-axis machine tools' guideways based on tolerance
    Fan, Jinwei
    Tao, Haohao
    Wu, Changjun
    Pan, Ri
    Tang, Yuhang
    Li, Zhongsheng
    [J]. INTERNATIONAL JOURNAL OF ADVANCED MANUFACTURING TECHNOLOGY, 2018, 98 (5-8) : 1131 - 1144
  • [14] Nonlinear fault detection of batch processes based on functional kernel locality preserving projections
    He, Fei
    Wang, Chaojun
    Fan, Shu-Kai S.
    [J]. CHEMOMETRICS AND INTELLIGENT LABORATORY SYSTEMS, 2018, 183 : 79 - 89
  • [15] He XF, 2005, IEEE International Conference on Computer Vision, p. 1208
  • [16] Face recognition using Laplacianfaces
    He, XF
    Yan, SC
    Hu, YX
    Niyogi, P
    Zhang, HJ
    [J]. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2005, 27 (03) : 328 - 340
  • [17] Regularized coplanar discriminant analysis for dimensionality reduction
    Huang, Ke-Kun
    Dai, Dao-Qing
    Ren, Chuan-Xian
    [J]. PATTERN RECOGNITION, 2017, 62 : 87 - 98
  • [18] Robust data representation using locally linear embedding guided PCA
    Jiang, Bo
    Ding, Chris
    Luo, Bin
    [J]. NEUROCOMPUTING, 2018, 275 : 523 - 532
  • [19] Orthogonal neighborhood preserving projections: A projection-based dimensionality reduction technique
    Kokiopoulou, Effrosyni
    Saad, Yousef
    [J]. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2007, 29 (12) : 2143 - 2156
  • [20] Sparse data-dependent kernel principal component analysis based on least squares support vector machine for feature extraction and recognition
    Li, Jun-Bao
    Gao, Huijun
    [J]. NEURAL COMPUTING & APPLICATIONS, 2012, 21 (08) : 1971 - 1980