Robust tensor train component analysis

Cited: 2
Authors
Zhang, Xiongjun [1 ,2 ]
Ng, Michael K. [3 ]
Affiliations
[1] Cent China Normal Univ, Sch Math & Stat, Wuhan 430079, Peoples R China
[2] Cent China Normal Univ, Hubei Key Lab Math Sci, Wuhan 430079, Peoples R China
[3] Univ Hong Kong, Dept Math, Pokfulam, Hong Kong, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Low-rank tensor; Nuclear norm; Proximal alternating direction method of multipliers; Robust principal component analysis; Tensor train decomposition; RANK MINIMIZATION; FACE RECOGNITION; MATRIX FACTORIZATION; COMPLETION; MODELS; DECOMPOSITIONS; APPROXIMATION;
DOI
10.1002/nla.2403
CLC (Chinese Library Classification) number
O29 [Applied Mathematics]
Discipline code
070104
Abstract
Robust principal component analysis plays a key role in fields such as image and video processing, data mining, and hyperspectral data analysis. In this paper, we study robust tensor train (TT) principal component analysis from partial observations, which aims to decompose a given tensor into a low-TT-rank component and a sparse component. The decomposition in the proposed model uncovers hidden factors and helps alleviate the curse of dimensionality via a set of connected low-rank tensors. The relaxation model minimizes a weighted combination of the sum of nuclear norms of the unfolding matrices of the core tensors and the tensor l1 norm. A proximal alternating direction method of multipliers is developed to solve the resulting model. Furthermore, we show that, under some conditions, any cluster point of a convergent subsequence is a Karush-Kuhn-Tucker point of the proposed model. Extensive numerical examples on both synthetic data and real-world datasets demonstrate the effectiveness of the proposed approach.
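The "set of connected low-rank tensors" in the abstract refers to the tensor train format, in which a d-way tensor is factored into a chain of 3-way core tensors whose unfolding matrices the model penalizes. As background, here is a minimal TT-SVD sketch in NumPy; the function names `tt_svd` and `tt_reconstruct` are illustrative and not from the paper, which additionally handles partial observations and a sparse component on top of this format:

```python
import numpy as np

def tt_svd(x, eps=1e-10):
    """Factor a full tensor into tensor-train cores via sequential truncated SVDs."""
    dims = x.shape
    cores = []
    r_prev = 1
    c = x.reshape(r_prev * dims[0], -1)
    for k in range(len(dims) - 1):
        u, s, vt = np.linalg.svd(c, full_matrices=False)
        # Truncation rank: keep singular values above a relative threshold.
        r = max(1, int(np.sum(s > eps * s[0])))
        cores.append(u[:, :r].reshape(r_prev, dims[k], r))
        # Carry the remainder forward, absorbing the singular values.
        c = (s[:r, None] * vt[:r]).reshape(r * dims[k + 1], -1)
        r_prev = r
    cores.append(c.reshape(r_prev, dims[-1], 1))
    return cores

def tt_reconstruct(cores):
    """Contract the TT cores back into a full tensor."""
    out = cores[0]
    for core in cores[1:]:
        out = np.tensordot(out, core, axes=([-1], [0]))
    return out.reshape([c.shape[1] for c in cores])
```

With a tight truncation threshold the cores reproduce the input tensor exactly, while a looser `eps` yields the low-TT-rank compression that the robust model exploits.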
Pages: 26
Related articles (50 in total)
  • [1] Low-rank tensor train for tensor robust principal component analysis
    Yang, Jing-Hua
    Zhao, Xi-Le
    Ji, Teng-Yu
    Ma, Tian-Hui
    Huang, Ting-Zhu
    APPLIED MATHEMATICS AND COMPUTATION, 2020, 367
  • [2] Principal component analysis with tensor train subspace
    Wang, Wenqi
    Aggarwal, Vaneet
    Aeron, Shuchin
    PATTERN RECOGNITION LETTERS, 2019, 122: 86-91
  • [3] Robust principal component analysis based on tensor train rank and Schatten p-norm
    Zhang, Pengcheng
    Geng, Juan
    Liu, Yapeng
    Yang, Shouxin
    VISUAL COMPUTER, 2023, 39 (11): 5849-5867
  • [4] Graph Regularized Low-Rank Tensor-Train for Robust Principal Component Analysis
    Sofuoglu, Seyyid Emre
    Aviyente, Selin
    IEEE SIGNAL PROCESSING LETTERS, 2022, 29: 1152-1156
  • [5] Robust block tensor principal component analysis
    Feng, Lanlan
    Liu, Yipeng
    Chen, Longxi
    Zhang, Xiang
    Zhu, Ce
    SIGNAL PROCESSING, 2020, 166
  • [6] Quantized Tensor Robust Principal Component Analysis
    Aidini, Anastasia
    Tsagkatakis, Grigorios
    Tsakalides, Panagiotis
    2020 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, 2020: 2453-2457
  • [7] Online Tensor Robust Principal Component Analysis
    Salut, Mohammad M.
    Anderson, David V.
    IEEE ACCESS, 2022, 10: 69354-69363
  • [8] Tensor Robust Principal Component Analysis with a New Tensor Nuclear Norm
    Lu, Canyi
    Feng, Jiashi
    Chen, Yudong
    Liu, Wei
    Lin, Zhouchen
    Yan, Shuicheng
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2020, 42 (04): 925-938
  • [9] Convex-Concave Tensor Robust Principal Component Analysis
    Liu, Youfa
    Du, Bo
    Chen, Yongyong
    Zhang, Lefei
    Gong, Mingming
    Tao, Dacheng
    INTERNATIONAL JOURNAL OF COMPUTER VISION, 2024, 132: 1721-1747