Graph Regularized Low-Rank Tensor-Train for Robust Principal Component Analysis

Times Cited: 2
Authors
Sofuoglu, Seyyid Emre [1 ]
Aviyente, Selin [1 ]
Affiliations
[1] Michigan State Univ, Dept Elect & Comp Engn, E Lansing, MI 48824 USA
Keywords
Tensors; principal component analysis; geometric modeling; robustness
DOI
10.1109/LSP.2022.3170251
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology]
Discipline Classification Codes
0808; 0809
Abstract
With the advance of sensor technology, it is becoming more commonplace to collect multi-mode data, i.e., tensors, with high dimensionality. To deal with the large amounts of redundancy in tensorial data, different dimensionality reduction methods such as low-rank tensor decomposition have been developed. While low-rank decompositions capture the global structure, there is a need to leverage the underlying local geometry through manifold learning methods. Manifold learning methods have been widely considered in tensor factorization to incorporate the low-dimensional geometry of the underlying data. However, existing techniques focus on only one mode of the data and exploit correlations among the features to reduce the dimension of the feature vectors. Recently, multiway graph signal processing approaches that exploit the correlations among all modes of a tensor have been proposed to learn low-dimensional representations. Inspired by this idea, in this letter we propose a graph regularized robust tensor-train decomposition method where the graph regularization is applied across each mode of the tensor to incorporate the local geometry. As the resulting optimization problem is computationally prohibitive due to the high dimensionality of the graph regularization terms, an equivalence between mode-n canonical unfolding and regular mode-n unfolding is derived, resulting in a computationally efficient optimization algorithm. The proposed method is evaluated on both synthetic and real tensors for denoising and tensor completion.
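To make the per-mode graph regularization idea concrete, the following is a minimal NumPy sketch, not taken from the letter: it assumes the regularizer takes the common Laplacian-smoothness form tr(X_(n)^T L_n X_(n)) summed over modes, and all function and variable names (mode_n_unfold, multiway_graph_penalty, etc.) are hypothetical illustrations, not the authors' implementation.

```python
import numpy as np

def mode_n_unfold(tensor, n):
    """Regular mode-n unfolding: bring mode n to the front and flatten the rest."""
    return np.moveaxis(tensor, n, 0).reshape(tensor.shape[n], -1)

def graph_laplacian(W):
    """Combinatorial graph Laplacian L = D - W of a symmetric adjacency matrix W."""
    return np.diag(W.sum(axis=1)) - W

def multiway_graph_penalty(tensor, adjacencies):
    """Sum of tr(X_(n)^T L_n X_(n)) over all modes, one similarity graph per mode.

    Illustrative only: the letter's method embeds such terms in a robust
    tensor-train decomposition rather than evaluating them in isolation.
    """
    penalty = 0.0
    for n, W in enumerate(adjacencies):
        Xn = mode_n_unfold(tensor, n)   # shape: I_n x (product of the other dims)
        Ln = graph_laplacian(W)         # shape: I_n x I_n
        # sum((Ln @ Xn) * Xn) equals tr(Xn^T Ln Xn) without forming a large matrix
        penalty += np.sum((Ln @ Xn) * Xn)
    return penalty

# Usage example: a random 3-mode tensor with a ring-graph adjacency on each mode.
rng = np.random.default_rng(0)
X = rng.standard_normal((8, 9, 10))
graphs = []
for size in X.shape:
    W = np.zeros((size, size))
    idx = np.arange(size)
    W[idx, (idx + 1) % size] = 1.0
    W[(idx + 1) % size, idx] = 1.0
    graphs.append(W)
print(multiway_graph_penalty(X, graphs))
```

The sketch uses the regular mode-n unfolding; the letter's computational saving comes from relating this unfolding to the mode-n canonical unfolding that appears naturally in the tensor-train factors, which is not reproduced here.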
Pages: 1152-1156
Page Count: 5
Related Papers
50 records in total
  • [31] GRAPH REGULARIZED LOW-RANK MATRIX RECOVERY FOR ROBUST PERSON RE-IDENTIFICATION
    Tsai, Ming-Chia
    Wei, Chia-Po
    Wang, Yu-Chiang Frank
    2015 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2015, : 4654 - 4658
  • [32] Graph regularized low-rank representation for submodule clustering
    Wu, Tong
    PATTERN RECOGNITION, 2020, 100
  • [33] Graph-Regularized Generalized Low-Rank Models
    Paradkar, Mihir
    Udell, Madeleine
    2017 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION WORKSHOPS (CVPRW), 2017, : 1921 - 1926
  • [34] Low-Rank Tensor Regularized Graph Fuzzy Learning for Multi-View Data Processing
    Pan, Baicheng
    Li, Chuandong
    Che, Hangjun
    Leung, Man-Fai
    Yu, Keping
    IEEE TRANSACTIONS ON CONSUMER ELECTRONICS, 2024, 70 (01) : 2925 - 2938
  • [35] Tensor-Train Discriminant Analysis
    Sofuoglu, Seyyid Emre
    Aviyente, Selin
    2019 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2019, : 3422 - 3426
  • [36] Robust Low-Rank Tensor Ring Completion
    Huang, Huyan
    Liu, Yipeng
    Long, Zhen
    Zhu, Ce
    IEEE TRANSACTIONS ON COMPUTATIONAL IMAGING, 2020, 6 : 1117 - 1126
  • [37] Online Robust Low-Rank Tensor Learning
    Li, Ping
    Feng, Jiashi
    Jin, Xiaojie
    Zhang, Luming
    Xu, Xianghua
    Yan, Shuicheng
    PROCEEDINGS OF THE TWENTY-SIXTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2017, : 2180 - 2186
  • [38] Sparse regularized low-rank tensor regression with applications in genomic data analysis
    Le Ou-Yang
    Zhang, Xiao-Fei
    Yan, Hong
    PATTERN RECOGNITION, 2020, 107
  • [39] Low-Rank Tensor Regularized Fuzzy Clustering for Multiview Data
    Wei, Huiqin
    Chen, Long
    Ruan, Keyu
    Li, Lingxi
    Chen, Long
    IEEE TRANSACTIONS ON FUZZY SYSTEMS, 2020, 28 (12) : 3087 - 3099
  • [40] Mixed norm regularized models for low-rank tensor completion
    Bu, Yuanyang
    Zhao, Yongqiang
    Chan, Jonathan Cheung-Wai
    INFORMATION SCIENCES, 2024, 670