Robust Low-Rank Tensor Recovery with Rectification and Alignment

Cited by: 148
Authors
Zhang, Xiaoqin [1 ]
Wang, Di [1 ]
Zhou, Zhengyuan [2 ]
Ma, Yi [3 ]
Affiliations
[1] Wenzhou Univ, Coll Comp Sci & Artificial Intelligence, Wenzhou 325035, Zhejiang, Peoples R China
[2] Stanford Univ, Dept Elect Engn, Stanford, CA 94305 USA
[3] Univ Calif Berkeley, Dept Elect Engn & Comp Sci, Berkeley, CA 94720 USA
Funding
National Natural Science Foundation of China
Keywords
Low-rank tensor recovery; rectification; alignment; ADMM; proximal gradient; MATRIX COMPLETION; MODELS; FACTORIZATION; ALGORITHM;
DOI
10.1109/TPAMI.2019.2929043
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Low-rank tensor recovery in the presence of sparse but arbitrary errors is an important problem with many practical applications. In this work, we propose a general framework that recovers low-rank tensors, in which the data can be deformed by some unknown transformations and corrupted by arbitrary sparse errors. We give a unified presentation of the surrogate-based formulations that incorporate the features of rectification and alignment simultaneously, and establish worst-case error bounds of the recovered tensor. In this context, the state-of-the-art methods 'RASL' and 'TILT' can be viewed as two special cases of our work, and yet each only performs part of the function of our method. Subsequently, we study the optimization aspects of the problem in detail by deriving two algorithms, one based on the alternating direction method of multipliers (ADMM) and the other based on proximal gradient. We provide convergence guarantees for the latter algorithm, and demonstrate the performance of the former through in-depth simulations. Finally, we present extensive experimental results on public datasets to demonstrate the effectiveness and efficiency of the proposed framework and algorithms.
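The abstract describes a low-rank-plus-sparse recovery framework with unknown deformations, solved by ADMM or proximal gradient. As a rough, self-contained illustration only (this is not the paper's algorithm: it has no transformation/alignment step, uses matrix rather than tensor data, and the names rpca_admm, svt, and shrink are illustrative), the NumPy sketch below solves the classical robust-PCA subproblem min ||L||_* + lam*||S||_1 subject to D = L + S by ADMM, which is the low-rank/sparse splitting that such methods build on.

import numpy as np

def svt(X, tau):
    # Singular value thresholding: proximal operator of tau * nuclear norm.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def shrink(X, tau):
    # Soft thresholding: proximal operator of tau * l1 norm.
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def rpca_admm(D, lam=None, mu=None, n_iter=500, tol=1e-7):
    # ADMM for min ||L||_* + lam*||S||_1  s.t.  D = L + S (matrix RPCA).
    m, n = D.shape
    lam = lam if lam is not None else 1.0 / np.sqrt(max(m, n))
    mu = mu if mu is not None else 0.25 * m * n / (np.abs(D).sum() + 1e-12)
    L, S, Y = np.zeros_like(D), np.zeros_like(D), np.zeros_like(D)
    for _ in range(n_iter):
        L = svt(D - S + Y / mu, 1.0 / mu)        # low-rank update
        S = shrink(D - L + Y / mu, lam / mu)     # sparse-error update
        R = D - L - S                            # primal residual
        Y = Y + mu * R                           # dual ascent step
        if np.linalg.norm(R) <= tol * (np.linalg.norm(D) + 1e-12):
            break
    return L, S

For instance, stacking vectorized image frames as the columns of D and calling L, S = rpca_admm(D) separates a low-rank component L from sparse corruptions S; the paper's framework additionally optimizes per-sample transformations and operates on tensor-structured data.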
Pages: 238-255
Page count: 18
Related Papers
50 records in total
  • [31] Liu, Xuya; Hao, Caiyan; Su, Zezhao; Qi, Zerong; Fu, Shujun; Li, Yuliang; Han, Hongbin. Image inpainting by low-rank tensor decomposition and multidirectional search. JOURNAL OF ELECTRONIC IMAGING, 2021, 30 (05).
  • [32] Wu, Tong. Online Tensor Low-Rank Representation for Streaming Data Clustering. IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2023, 33 (02): 602-617.
  • [33] Zhou, Pan; Lu, Canyi; Lin, Zhouchen; Zhang, Chao. Tensor Factorization for Low-Rank Tensor Completion. IEEE TRANSACTIONS ON IMAGE PROCESSING, 2018, 27 (03): 1152-1163.
  • [34] Su, Liyu; Liu, Jing; Zhang, Jianting; Tian, Xiaoqing; Zhang, Hailang; Ma, Chaoqun. Smooth low-rank representation with a Grassmann manifold for tensor completion. KNOWLEDGE-BASED SYSTEMS, 2023, 270.
  • [35] Tang, Yongqiang; Xie, Yuan; Zhang, Wensheng. Affine Subspace Robust Low-Rank Self-Representation: From Matrix to Tensor. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, 45 (08): 9357-9373.
  • [36] Wang, Jing; Wang, Xiao; Tian, Feng; Liu, Chang Hong; Yu, Hongchuan. Constrained Low-Rank Representation for Robust Subspace Clustering. IEEE TRANSACTIONS ON CYBERNETICS, 2017, 47 (12): 4534-4546.
  • [37] Cai, Changxiao; Li, Gen; Poor, H. Vincent; Chen, Yuxin. Nonconvex Low-Rank Tensor Completion from Noisy Data. OPERATIONS RESEARCH, 2022, 70 (02): 1219-1237.
  • [38] Zdunek, Rafal; Fonal, Krzysztof; Sadowski, Tomasz. Image Completion with Filtered Low-Rank Tensor Train Approximations. ADVANCES IN COMPUTATIONAL INTELLIGENCE, IWANN 2019, PT II, 2019, 11507: 235-245.
  • [39] Heng, Qiang; Chi, Eric C.; Liu, Yufeng. Robust Low-Rank Tensor Decomposition with the L2 Criterion. TECHNOMETRICS, 2023, 65 (04): 537-552.
  • [40] Zhang, Lei; Liu, Cong. Robust Low-Rank Analysis with Adaptive Weighted Tensor for Image Denoising. DISPLAYS, 2022, 73.