Graph Regularized Low-Rank Tensor-Train for Robust Principal Component Analysis

Cited by: 2
Authors
Sofuoglu, Seyyid Emre [1 ]
Aviyente, Selin [1 ]
Affiliation
[1] Michigan State Univ, Dept Elect & Comp Engn, E Lansing, MI 48824 USA
Keywords
Tensors; principal component analysis; geometric modeling; robustness;
DOI
10.1109/LSP.2022.3170251
CLC Classification Number
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology];
Subject Classification Number
0808; 0809;
Abstract
With advances in sensor technology, it is becoming more commonplace to collect multi-mode data, i.e., tensors, with high dimensionality. To deal with the large amounts of redundancy in tensorial data, different dimensionality reduction methods such as low-rank tensor decomposition have been developed. While low-rank decompositions capture the global structure, there is a need to leverage the underlying local geometry through manifold learning methods. Manifold learning methods have been widely considered in tensor factorization to incorporate the low-dimensional geometry of the underlying data. However, existing techniques focus on only one mode of the data and exploit correlations among the features to reduce the dimension of the feature vectors. Recently, multiway graph signal processing approaches that exploit the correlations among all modes of a tensor have been proposed to learn low-dimensional representations. Inspired by this idea, in this letter we propose a graph regularized robust tensor-train decomposition method where the graph regularization is applied across each mode of the tensor to incorporate the local geometry. As the resulting optimization problem is computationally prohibitive due to the high dimensionality of the graph regularization terms, an equivalence between mode-n canonical unfolding and regular mode-n unfolding is derived, resulting in a computationally efficient optimization algorithm. The proposed method is evaluated on both synthetic and real tensors for denoising and tensor completion.
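The abstract describes graph regularization applied across every mode of the tensor, with each term expressed on a mode-n unfolding. The sketch below is not the paper's implementation; it only illustrates the form of such a multiway regularizer, sum over n of tr(X_(n)^T L_n X_(n)), under the assumption that each per-mode graph Laplacian L_n is built from a k-nearest-neighbor graph (NumPy, the function names, and the k-NN construction are illustrative choices, not from the letter).

import numpy as np

def mode_n_unfold(X, n):
    # Mode-n unfolding: bring mode n to the front, flatten the remaining modes.
    return np.reshape(np.moveaxis(X, n, 0), (X.shape[n], -1))

def knn_laplacian(rows, k=5):
    # Unnormalized Laplacian of a symmetrized k-nearest-neighbor graph built on
    # the rows of a matrix (an assumed, illustrative way to obtain L_n).
    d2 = np.sum((rows[:, None, :] - rows[None, :, :]) ** 2, axis=-1)
    neighbors = np.argsort(d2, axis=1)[:, 1:k + 1]          # skip the self match
    W = np.zeros_like(d2)
    W[np.repeat(np.arange(rows.shape[0]), k), neighbors.ravel()] = 1.0
    W = np.maximum(W, W.T)                                   # symmetrize the graph
    return np.diag(W.sum(axis=1)) - W

def multiway_graph_regularizer(X, laplacians):
    # sum_n tr( X_(n)^T L_n X_(n) ), evaluated as an elementwise product so that
    # the large (prod of other dims) x (prod of other dims) matrix is never formed.
    total = 0.0
    for n, L in enumerate(laplacians):
        Xn = mode_n_unfold(X, n)
        total += float(np.sum(Xn * (L @ Xn)))
    return total

# Example on a small random tensor; each mode gets its own graph Laplacian.
X = np.random.randn(10, 12, 14)
Ls = [knn_laplacian(mode_n_unfold(X, n)) for n in range(X.ndim)]
print(multiway_graph_regularizer(X, Ls))

In the letter itself, the cost of handling such terms is reduced through the derived equivalence between the mode-n canonical unfolding and the regular mode-n unfolding; the sketch above only shows the shape of the regularizer, not that reformulation.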
Pages: 1152 - 1156
Number of pages: 5