A graph regularized dimension reduction method for out-of-sample data

Cited by: 11
Authors
Tang, Mengfan [1 ]
Nie, Feiping [2 ,3 ]
Jain, Ramesh [1 ]
Affiliations
[1] Univ Calif Irvine, Dept Comp Sci, Irvine, CA 92697 USA
[2] Northwestern Polytech Univ, Sch Comp Sci, Xian, Peoples R China
[3] Northwestern Polytech Univ, Ctr OPT IMagery Anal & Learning OPTIMAL, Xian, Peoples R China
Keywords
Dimension reduction; Out-of-sample data; Graph regularized PCA; Manifold learning; Clustering; Recognition; Eigenmaps
DOI
10.1016/j.neucom.2016.11.012
Chinese Library Classification
TP18 [Theory of Artificial Intelligence]
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Among various dimension reduction techniques, Principal Component Analysis (PCA) specializes in vector data, whereas Laplacian embedding is often employed for embedding graph data. Graph regularized PCA, a combination of the two, has also been developed to learn a low-dimensional representation of vector data with the aid of graph data. However, these approaches face the out-of-sample problem: each time new data arrives, it must be combined with the old data and the eigenvectors re-computed from scratch, incurring enormous computational cost. To address this problem, we extend graph regularized PCA to graph regularized linear regression PCA (grlrPCA). grlrPCA eliminates the redundant computation on the old data by first learning a linear function and then applying it directly to new data for dimension reduction. Furthermore, we derive an efficient iterative algorithm to solve the grlrPCA optimization problem and show that grlrPCA is closely related to unsupervised Linear Discriminant Analysis in the limit of an infinite regularization parameter. Evaluations on multiple metrics over seven real-world datasets demonstrate that grlrPCA outperforms established unsupervised dimension reduction algorithms.
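The idea described in the abstract can be illustrated with a minimal sketch: embed the training data under a PCA objective with a graph-Laplacian regularizer, then fit a ridge-regularized linear map from inputs to the embedding, so unseen samples are projected with a single matrix multiplication instead of re-running the eigendecomposition. This is an illustrative assumption of the general scheme, not the authors' exact grlrPCA algorithm or notation; all function and variable names here are hypothetical.

```python
import numpy as np

def grlr_pca_sketch(X, A, k, alpha=1.0, beta=1.0):
    """Illustrative graph-regularized PCA with a learned linear
    out-of-sample map (a sketch, not the paper's exact method).

    X : (d, n) data matrix, samples as columns
    A : (n, n) symmetric affinity (graph) matrix
    k : target dimension
    alpha : weight of the graph-Laplacian smoothness term
    beta  : strength of the linear-regression (out-of-sample) fit
    """
    d, n = X.shape
    Xc = X - X.mean(axis=1, keepdims=True)      # center the data
    L = np.diag(A.sum(axis=1)) - A              # unnormalized graph Laplacian
    # Embedding: top-k eigenvectors of X^T X - alpha * L.
    # The PCA term rewards variance; the Laplacian term rewards
    # smoothness of the embedding over the graph.
    M = Xc.T @ Xc - alpha * L
    vals, vecs = np.linalg.eigh(M)              # ascending eigenvalues
    V = vecs[:, ::-1][:, :k]                    # (n, k) low-dim embedding
    # Learn a ridge-regularized linear map W with X^T W ≈ V, so a new
    # sample is embedded directly, with no eigendecomposition redone.
    W = np.linalg.solve(Xc @ Xc.T + (1.0 / beta) * np.eye(d), Xc @ V)
    return V, W

# Usage: embed training data, then project an unseen sample directly.
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 30))
A = (rng.random((30, 30)) > 0.8).astype(float)
A = np.maximum(A, A.T)
np.fill_diagonal(A, 0)
V, W = grlr_pca_sketch(X, A, k=2)
x_new = rng.standard_normal((5, 1))
z_new = (x_new - X.mean(axis=1, keepdims=True)).T @ W  # (1, k) embedding
```

The out-of-sample step is the key point: once `W` is learned, embedding a new sample costs one (d × k) matrix product, which is what removes the re-computation burden the abstract describes.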
Pages: 58-63 (6 pages)
Related papers
50 total
  • [21] A data-adaptive hybrid method for dimension reduction
    Zhu, Li-Ping
    Zhu, Li-Xing
    JOURNAL OF NONPARAMETRIC STATISTICS, 2009, 21 (07) : 851 - 861
  • [23] An Impartial Trimming Approach for Joint Dimension and Sample Reduction
    Greco, Luca
    Lucadamo, Antonio
    Amenta, Pietro
    JOURNAL OF CLASSIFICATION, 2020, 37 (03) : 769 - 788
  • [24] Dimension reduction for covariates in network data
    Zhao, Junlong
    Liu, Xiumin
    Wang, Hansheng
    Leng, Chenlei
    BIOMETRIKA, 2022, 109 (01) : 85 - 102
  • [25] Improving out-of-sample forecasts of stock price indexes with forecast reconciliation and clustering
    Mattera, Raffaele
    Athanasopoulos, George
    Hyndman, Rob
    QUANTITATIVE FINANCE, 2024, 24 (11) : 1641 - 1667
  • [26] A dimension reduction technique applied to regression on high dimension, low sample size neurophysiological data sets
    Santana, Adrielle C.
    Barbosa, Adriano V.
    Yehia, Hani C.
    Laboissiere, Rafael
    BMC NEUROSCIENCE, 2021, 22 (01)
  • [28] Discriminative sparse embedding based on adaptive graph for dimension reduction
    Liu, Zhonghua
    Shi, Kaiming
    Zhang, Kaibing
    Ou, Weihua
    Wang, Lin
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2020, 94
  • [29] ADM: adaptive graph diffusion for meta-dimension reduction
    Feng, Junning
    Liang, Yong
    Yu, Tianwei
    BRIEFINGS IN BIOINFORMATICS, 2024, 26 (01)
  • [30] EXTENDING OUT-OF-SAMPLE MANIFOLD LEARNING VIA META-MODELLING TECHNIQUES
    Taskin, Gulsen
    Crawford, Melba
    2017 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM (IGARSS), 2017, : 562 - 565