Principal Component Analysis on Graph-Hessian

Cited by: 0
Authors
Pan, Yichen [1 ]
Liu, Weifeng [1 ]
Zhou, Yicong [2 ]
Nie, Liqiang [3 ]
Affiliations
[1] China Univ Petr East China, Coll Control Sci & Engn, Qingdao, Peoples R China
[2] Univ Macau, Fac Sci & Technol, Macau, Peoples R China
[3] Shandong Univ, Sch Comp Sci & Technol, Qingdao, Peoples R China
Source
2019 IEEE SYMPOSIUM SERIES ON COMPUTATIONAL INTELLIGENCE (IEEE SSCI 2019) | 2019
Funding
National Natural Science Foundation of China;
Keywords
dimensionality reduction; principal component analysis; manifold learning; graph; Hessian regularization; PCA;
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Principal Component Analysis (PCA) is a widely used linear dimensionality reduction method that assumes the data are drawn from a low-dimensional affine subspace of a high-dimensional space. However, it uses only the feature information of the samples. By exploiting the structural information of the data and embedding it into the PCA framework, the local positional relationships between samples in the original space can be preserved, improving the performance of downstream tasks built on PCA. In this paper, we introduce Hessian regularization into PCA and propose a new model called Graph-Hessian Principal Component Analysis (GHPCA). The Hessian correctly exploits the intrinsic local geometry of the data manifold and better maintains the neighborhood relationships between data points in the high-dimensional space. Compared with other Laplacian-based models, our model obtains richer structural information after dimensionality reduction and better recovers low-dimensional structures. K-means clustering experiments on the USPS handwritten digit dataset, the YALE face dataset, and the COIL20 object image dataset show that GHPCA outperforms PCA, GLPCA, RPCA, and RPCAG on clustering tasks.
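The abstract's idea of embedding a graph regularizer into the PCA objective can be sketched as follows. This is a minimal illustration, not the paper's implementation: it uses the standard graph-regularized PCA formulation (maximize tr(Qᵀ(XXᵀ − αR)Q) over orthonormal Q), with a simple kNN graph Laplacian standing in for the paper's Hessian energy matrix R, and all function names are hypothetical.

```python
import numpy as np

def knn_graph_laplacian(X, k=5):
    """Unnormalized Laplacian L = D - W of a symmetrized kNN graph.
    X has shape (n_samples, n_features)."""
    n = X.shape[0]
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise sq. dists
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(d2[i])[1:k + 1]  # k nearest, skipping self
        W[i, nbrs] = 1.0
    W = np.maximum(W, W.T)                 # symmetrize adjacency
    return np.diag(W.sum(axis=1)) - W

def graph_regularized_pca(X, n_components=2, alpha=1.0, k=5):
    """Embedding Q maximizes tr(Q^T (Xc Xc^T - alpha * R) Q), Q^T Q = I.
    R is the graph regularizer; a Hessian energy matrix could be
    substituted for the Laplacian used here."""
    Xc = X - X.mean(axis=0)                # center the features
    R = knn_graph_laplacian(Xc, k=k)
    M = Xc @ Xc.T - alpha * R              # symmetric (n, n) matrix
    vals, vecs = np.linalg.eigh(M)         # ascending eigenvalues
    Q = vecs[:, ::-1][:, :n_components]    # top eigenvectors as embedding
    return Q

rng = np.random.default_rng(0)
X = rng.standard_normal((40, 8))
Q = graph_regularized_pca(X, n_components=2, alpha=0.5)
print(Q.shape)  # (40, 2)
```

With alpha = 0 this reduces to ordinary PCA on the sample Gram matrix; increasing alpha trades reconstruction quality for smoothness of the embedding over the graph, which is the role the Hessian regularizer plays in GHPCA.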
Pages: 1494-1501
Page count: 8
Related Papers
33 records in total
[21]   Multiview Hessian regularized logistic regression for action recognition [J].
Liu, Weifeng ;
Liu, Hongli ;
Tao, Dapeng ;
Wang, Yanjiang ;
Lu, Ke .
SIGNAL PROCESSING, 2015, 110 :101-107
[22]   Multiview Hessian Regularization for Image Annotation [J].
Liu, Weifeng ;
Tao, Dacheng .
IEEE TRANSACTIONS ON IMAGE PROCESSING, 2013, 22 (07) :2676-2687
[23]  
Nene S. A., 1996, Tech. Rep. CUCS-005-96, P1
[24]  
Petersen K. B., 2008, Technical University of Denmark, V7, P510
[25]  
Vidal R., 2016, Generalized Principal Component Analysis
[26]   Nonlinear component analysis as a kernel eigenvalue problem [J].
Scholkopf, B ;
Smola, A ;
Muller, KR .
NEURAL COMPUTATION, 1998, 10 (05) :1299-1319
[27]   The adaptive block sparse PCA and its application to multi-subject FMRI data analysis using sparse mCCA [J].
Seghouane, Abd-Krim ;
Iqbal, Asif .
SIGNAL PROCESSING, 2018, 153 :311-320
[28]   Robust Principal Component Analysis on Graphs [J].
Shahid, Nauman ;
Kalofolias, Vassilis ;
Bresson, Xavier ;
Bronstein, Michael ;
Vandergheynst, Pierre .
2015 IEEE INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV), 2015, :2812-2820
[29]   Sparse principal component analysis via regularized low rank matrix approximation [J].
Shen, Haipeng ;
Huang, Jianhua Z. .
JOURNAL OF MULTIVARIATE ANALYSIS, 2008, 99 (06) :1015-1034
[30]   Probabilistic principal component analysis [J].
Tipping, ME ;
Bishop, CM .
JOURNAL OF THE ROYAL STATISTICAL SOCIETY SERIES B-STATISTICAL METHODOLOGY, 1999, 61 :611-622