Non-Greedy L21-Norm Maximization for Principal Component Analysis

Cited by: 14
Authors
Nie, Feiping [1 ,2 ]
Tian, Lai [1 ,2 ]
Huang, Heng [3 ]
Ding, Chris [4 ]
Affiliations
[1] Northwestern Polytech Univ, Sch Comp Sci, Xian 710072, Peoples R China
[2] Northwestern Polytech Univ, Sch Artificial Intelligence Opt & Elect iOPEN, Xian 710072, Peoples R China
[3] Univ Pittsburgh, Dept Elect & Comp Engn, Pittsburgh, PA 15261 USA
[4] Univ Texas Arlington, Dept Comp Sci & Engn, Arlington, TX 76019 USA
Funding
National Natural Science Foundation of China;
Keywords
Principal component analysis; Minimization; Covariance matrices; Robustness; Optimization; Convergence; Linear programming; robust dimensionality reduction; L21-norm maximization; FRAMEWORK;
DOI
10.1109/TIP.2021.3073282
CLC number
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Principal Component Analysis (PCA) is one of the most important unsupervised methods for handling high-dimensional data. However, due to the high computational complexity of its eigen-decomposition solution, it is hard to apply PCA to large-scale data with high dimensionality, e.g., millions of data points with millions of variables. Meanwhile, its squared L2-norm based objective makes it sensitive to data outliers. In recent research, an L1-norm maximization based PCA method was proposed for efficient computation and robustness to outliers. However, that work used a greedy strategy to solve for the eigenvectors. Moreover, the L1-norm maximization based objective may not be the correct robust PCA formulation, because it loses the theoretical connection to the minimization of the data reconstruction error, which is one of the most important intuitions and goals of PCA. In this paper, we propose to maximize an L21-norm based robust PCA objective, which is theoretically connected to the minimization of the reconstruction error. More importantly, we propose efficient non-greedy optimization algorithms to solve our objective and the more general L21-norm maximization problem, with theoretically guaranteed convergence. Experimental results on real-world data sets show the effectiveness of the proposed method for principal component analysis.
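For reference, a minimal sketch of the objective described in the abstract, in notation introduced here for illustration (X = [x_1, ..., x_n] denotes the centered data matrix, W a column-orthonormal projection matrix with k columns, and the L21-norm is taken column-wise); this is a paraphrase of the abstract, not a verbatim statement from the paper:

\max_{W \in \mathbb{R}^{d \times k},\; W^\top W = I_k} \; \|W^\top X\|_{2,1} \;=\; \sum_{i=1}^{n} \|W^\top x_i\|_2 .

The claimed link to reconstruction error can be read off the identity \|x_i - W W^\top x_i\|_2^2 = \|x_i\|_2^2 - \|W^\top x_i\|_2^2, which holds for any column-orthonormal W: enlarging a projected norm \|W^\top x_i\|_2 shrinks that sample's unsquared reconstruction error, whereas the L1-norm objective \sum_i \|W^\top x_i\|_1 admits no such direct relation.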
Pages: 5277-5286
Number of pages: 10