A generalized least-squares approach regularized with graph embedding for dimensionality reduction

Cited by: 44
Authors
Shen, Xiang-Jun [1 ]
Liu, Si-Xing [1 ]
Bao, Bing-Kun [2 ]
Pan, Chun-Hong [3 ]
Zha, Zheng-Jun [4 ]
Fan, Jianping [5 ]
Affiliations
[1] JiangSu Univ, Sch Comp Sci & Commun Engn, Nanjing 212013, Jiangsu, Peoples R China
[2] Nanjing Univ Posts & Telecommun, Nanjing, Jiangsu, Peoples R China
[3] Chinese Acad Sci, Inst Automat, Natl Lab Pattern Recognit, Beijing 100190, Peoples R China
[4] Univ Sci & Technol China, Sch Informat Sci & Technol, Hefei, Anhui, Peoples R China
[5] Univ N Carolina, Dept Comp Sci, Charlotte, NC 28223 USA
Funding
National Natural Science Foundation of China;
Keywords
Dimensionality reduction; Graph embedding; Subspace learning; Least-squares; PRESERVING PROJECTIONS; EIGENMAPS; FRAMEWORK;
DOI
10.1016/j.patcog.2019.107023
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In current graph embedding methods, low-dimensional projections are obtained by preserving either the global or the local geometrical structure of the data, but not both. In this paper, the PCA (Principal Component Analysis) idea of minimizing least-squares reconstruction error is regularized with graph embedding, unifying various local manifold embedding methods within a generalized framework that preserves both global and local low-dimensional subspace structure. Unlike standard PCA, the proposed generalized least-squares approach models the data distribution together with an instance penalty at each data point; PCA thereby becomes a special instance of the framework that preserves only global projections. By applying a graph-embedding regularizer, we obtain projections that preserve both the intrinsic local geometrical structure and the global structure of the data. Experimental results on a variety of face and handwritten-digit recognition datasets show that the proposed method yields lower-dimensional subspaces and higher classification accuracy than state-of-the-art graph embedding methods. (C) 2019 Elsevier Ltd. All rights reserved.
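The abstract describes combining PCA's least-squares objective with a graph-embedding regularizer. The record does not reproduce the paper's exact objective or its instance-penalty weights, so the following is only a minimal sketch of the general idea: maximize global variance (the PCA term) while penalizing projections that separate k-nearest neighbours (a graph-Laplacian term), solved as an eigenproblem. The function name, the trade-off parameter `alpha`, and the unweighted kNN graph are illustrative assumptions, not the paper's formulation.

```python
import numpy as np

def graph_regularized_pca(X, n_components=2, n_neighbors=5, alpha=0.1):
    """Sketch of a least-squares (PCA-style) projection with a
    graph-embedding regularizer; alpha trades global variance
    against local smoothness on a kNN graph.

    X : (n_samples, n_features) data matrix.
    Returns P : (n_features, n_components) projection matrix.
    """
    X = X - X.mean(axis=0)                 # center, as in PCA
    n = X.shape[0]

    # Unweighted k-nearest-neighbour adjacency from pairwise distances.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(d2[i])[1:n_neighbors + 1]   # skip self
        W[i, idx] = 1.0
    W = np.maximum(W, W.T)                 # symmetrize
    L = np.diag(W.sum(axis=1)) - W         # unnormalized graph Laplacian

    # Maximize trace(P^T (X^T X - alpha * X^T L X) P): the first term
    # is the PCA scatter, the second penalizes neighbours that are
    # projected far apart.
    S = X.T @ X - alpha * (X.T @ L @ X)
    vals, vecs = np.linalg.eigh((S + S.T) / 2)       # ensure symmetry
    order = np.argsort(vals)[::-1][:n_components]    # top eigenvectors
    return vecs[:, order]
```

Projecting is then `Y = X @ P`, as with ordinary PCA; setting `alpha = 0` recovers plain PCA, which matches the abstract's claim that PCA is a special instance of the framework.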
Pages: 10