Principal Component Analysis on Graph-Hessian

Cited by: 0
Authors
Pan, Yichen [1]
Liu, Weifeng [1]
Zhou, Yicong [2]
Nie, Liqiang [3]
Affiliations
[1] China Univ Petr East China, Coll Control Sci & Engn, Qingdao, Peoples R China
[2] Univ Macau, Fac Sci & Technol, Macau, Peoples R China
[3] Shandong Univ, Sch Comp Sci & Technol, Qingdao, Peoples R China
Source
2019 IEEE SYMPOSIUM SERIES ON COMPUTATIONAL INTELLIGENCE (IEEE SSCI 2019) | 2019
Funding
National Natural Science Foundation of China
Keywords
dimensionality reduction; principal component analysis; manifold learning; graph; Hessian regularization; PCA
DOI
Not available
CLC number
TP18 [Artificial Intelligence Theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Principal Component Analysis (PCA) is a widely used linear dimensionality reduction method that assumes the data are drawn from a low-dimensional affine subspace of a high-dimensional space. However, it uses only the feature information of the samples. By exploiting the structural information of the data and embedding it into the PCA framework, the local positional relationships between samples in the original space can be preserved, which improves the performance of downstream tasks built on PCA. In this paper, we introduce Hessian regularization into PCA and propose a new model called Graph-Hessian Principal Component Analysis (GHPCA). The Hessian correctly exploits the intrinsic local geometry of the data manifold and is better able to preserve the neighborhood relationships between data points in high-dimensional space. Compared with Laplacian-based models, our model retains richer structural information after dimensionality reduction and better recovers low-dimensional structures. K-means clustering experiments on the USPS handwritten digit dataset, the YALE face dataset, and the COIL20 object image dataset, comparing against PCA, GLPCA, RPCA, and RPCAG, show that our model outperforms these principal component analysis models on clustering tasks.
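The abstract does not spell out the GHPCA objective, so the following is a minimal sketch under assumptions, not the authors' implementation. It assumes a GLPCA-style formulation, min_{U,Q} ||X - U Q^T||_F^2 + alpha * tr(Q^T M Q) with Q^T Q = I, where the structure matrix M would be the graph Laplacian for GLPCA and the Hessian energy matrix of Donoho and Grimes (2003) for GHPCA. The function name structured_pca, the parameter alpha, and the use of a kNN graph Laplacian as a stand-in for the Hessian energy matrix are illustrative assumptions.

import numpy as np
from scipy.linalg import eigh
from sklearn.neighbors import kneighbors_graph

def structured_pca(X, M, alpha, k):
    """Structure-regularized PCA sketch (hypothetical, GLPCA-style).
    X: p x n data matrix (columns are samples); M: n x n PSD structure
    matrix; alpha: regularization weight; k: target dimension.
    Returns the basis U (p x k) and the embedding Q (n x k)."""
    # At the optimum U = X Q (given Q^T Q = I), so the objective reduces to
    # min tr(Q^T (alpha*M - X^T X) Q) s.t. Q^T Q = I, which is solved by
    # the k eigenvectors of (alpha*M - X^T X) with smallest eigenvalues.
    G = alpha * M - X.T @ X
    _, Q = eigh(G, subset_by_index=[0, k - 1])  # k smallest eigenpairs
    return X @ Q, Q

# Toy usage: a symmetrized kNN graph Laplacian stands in for M here;
# actual GHPCA would instead assemble the Hessian energy matrix from
# local quadratic fits over neighborhoods (Donoho & Grimes, 2003).
rng = np.random.default_rng(0)
X = rng.standard_normal((64, 200))            # 64 features, 200 samples
W = kneighbors_graph(X.T, n_neighbors=5, mode='connectivity')
W = 0.5 * (W + W.T).toarray()                 # symmetric adjacency
L = np.diag(W.sum(axis=1)) - W                # unnormalized graph Laplacian
U, Q = structured_pca(X, L, alpha=1.0, k=10)  # Q: 200 x 10 embedding

The eigendecomposition step is where the choice of structure matrix matters: a Laplacian penalty only smooths first-order differences along graph edges, whereas a Hessian penalty leaves functions that are linear on the manifold unpenalized, which is the property the paper credits for preserving richer local geometry.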
Pages: 1494-1501
Number of pages: 8