Graph-dual Laplacian principal component analysis

Cited by: 5
Authors
He, Jinrong [1 ,2 ]
Bi, Yingzhou [3 ]
Liu, Bin [1 ,2 ]
Zeng, Zhigao [4 ]
Affiliations
[1] Northwest A&F Univ, Coll Informat Engn, Yangling 712100, Shaanxi, Peoples R China
[2] Minist Agr Peoples Republ China, Key Lab Agr Internet Things, Yangling 712100, Shaanxi, Peoples R China
[3] Guangxi Teachers Educ Univ, Sci Comp & Intelligent Informat Proc Guangxi High, Nanning 530001, Guangxi, Peoples R China
[4] Hunan Univ Technol, Coll Comp & Commun, Xiangtan 412000, Hunan, Peoples R China
Funding
National Natural Science Foundation of China; China Postdoctoral Science Foundation
Keywords
Principal component analysis; Graph-Laplacian PCA; Dual graph; Feature manifold; Graph-Dual Laplacian PCA; MATRIX FACTORIZATION; LP-NORM; OPTIMIZATION; ALGORITHM; NETWORK;
DOI
10.1007/s12652-018-1096-5
CLC number
TP18 [Artificial intelligence theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Principal component analysis (PCA) is the most widely used method for linear dimensionality reduction, owing to its effectiveness in exploring the low-dimensional global geometric structure embedded in data. To preserve the intrinsic local geometrical structure of data, graph-Laplacian PCA (gLPCA) incorporates Laplacian embedding into the PCA framework to learn local similarities between data points, which leads to significant performance improvements in clustering and classification. Recent work has shown that not only do high-dimensional data reside on a low-dimensional manifold in the data space, but the features also lie on a manifold in the feature space. However, both PCA and gLPCA overlook the local geometric information contained in the feature space. By considering the duality between the data manifold and the feature manifold, graph-dual Laplacian PCA (gDLPCA) is proposed, which incorporates data-graph regularization and feature-graph regularization into the PCA framework to exploit the local geometric structures of the data manifold and the feature manifold simultaneously. Experimental results on four benchmark data sets confirm its effectiveness and show that gDLPCA outperforms gLPCA on classification and clustering tasks.
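The abstract does not give the paper's exact objective function, so the sketch below is only an illustrative, commonly used formulation of dual-graph-regularized matrix factorization in the spirit of gDLPCA: minimize ||X - UV^T||_F^2 + alpha * tr(V^T Lv V) + beta * tr(U^T Lu U), where Lv is a kNN-graph Laplacian over the data points (data manifold) and Lu one over the features (feature manifold). The function names `gdlpca` and `knn_laplacian`, the alternating Sylvester-equation solver, and the alpha/beta values are all assumptions for illustration, not the authors' algorithm.

```python
# Illustrative sketch only: a dual-graph-regularized factorization in the
# spirit of gDLPCA, NOT the paper's exact algorithm.
import numpy as np
from scipy.linalg import solve_sylvester
from scipy.spatial.distance import cdist

def knn_laplacian(M, k=5):
    """Unnormalized Laplacian of a symmetrized kNN graph on the rows of M."""
    D = cdist(M, M)
    n = M.shape[0]
    W = np.zeros((n, n))
    idx = np.argsort(D, axis=1)[:, 1:k + 1]  # k nearest neighbors, skipping self
    for i in range(n):
        W[i, idx[i]] = 1.0
    W = np.maximum(W, W.T)                   # symmetrize the adjacency
    return np.diag(W.sum(axis=1)) - W        # L = D - W

def gdlpca(X, k=2, alpha=0.1, beta=0.1, n_iter=50, seed=0):
    """Alternating minimization (assumed objective) of
       ||X - U V^T||_F^2 + alpha*tr(V^T Lv V) + beta*tr(U^T Lu U),
       X: d features x n samples, U: d x k loadings, V: n x k embeddings."""
    d, n = X.shape
    Lv = knn_laplacian(X.T)   # data graph over the n samples (columns of X)
    Lu = knn_laplacian(X)     # feature graph over the d features (rows of X)
    rng = np.random.default_rng(seed)
    U = rng.standard_normal((d, k))
    V = rng.standard_normal((n, k))
    for _ in range(n_iter):
        # V-step: setting the gradient to zero gives the Sylvester equation
        #   alpha*Lv @ V + V @ (U^T U) = X^T U
        V = solve_sylvester(alpha * Lv, U.T @ U, X.T @ U)
        # U-step: beta*Lu @ U + U @ (V^T V) = X V
        U = solve_sylvester(beta * Lu, V.T @ V, X @ V)
    return U, V
```

With alpha = beta = 0, each step reduces to ordinary least-squares factorization, recovering plain PCA-like components; the two Laplacian terms pull sample embeddings (rows of V) and feature loadings (rows of U) toward their respective graph neighbors, which is the duality the abstract describes.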
Pages: 3249-3262 (14 pages)