Robust Low-Rank Tensor Recovery with Rectification and Alignment

Cited by: 148
Authors
Zhang, Xiaoqin [1 ]
Wang, Di [1 ]
Zhou, Zhengyuan [2 ]
Ma, Yi [3 ]
Affiliations
[1] Wenzhou Univ, Coll Comp Sci & Artificial Intelligence, Wenzhou 325035, Zhejiang, Peoples R China
[2] Stanford Univ, Dept Elect Engn, Stanford, CA 94305 USA
[3] Univ Calif Berkeley, Dept Elect Engn & Comp Sci, Berkeley, CA 94720 USA
Funding
National Natural Science Foundation of China
Keywords
Low-rank tensor recovery; rectification; alignment; ADMM; proximal gradient; MATRIX COMPLETION; MODELS; FACTORIZATION; ALGORITHM;
DOI
10.1109/TPAMI.2019.2929043
CLC number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Low-rank tensor recovery in the presence of sparse but arbitrary errors is an important problem with many practical applications. In this work, we propose a general framework that recovers low-rank tensors, in which the data can be deformed by some unknown transformations and corrupted by arbitrary sparse errors. We give a unified presentation of the surrogate-based formulations that incorporate the features of rectification and alignment simultaneously, and establish worst-case error bounds of the recovered tensor. In this context, the state-of-the-art methods 'RASL' and 'TILT' can be viewed as two special cases of our work, and yet each only performs part of the function of our method. Subsequently, we study the optimization aspects of the problem in detail by deriving two algorithms, one based on the alternating direction method of multipliers (ADMM) and the other based on proximal gradient. We provide convergence guarantees for the latter algorithm, and demonstrate the performance of the former through in-depth simulations. Finally, we present extensive experimental results on public datasets to demonstrate the effectiveness and efficiency of the proposed framework and algorithms.
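The paper's full framework additionally estimates unknown deformations, which the abstract above only summarizes. To make the shared algorithmic core concrete, here is a minimal sketch of the transformation-free matrix special case (classical robust PCA, min ||L||_* + λ||S||_1 s.t. D = L + S) solved by ADMM, alternating singular value thresholding with entrywise soft-thresholding. The function names (`svt`, `soft`, `rpca_admm`) and the default choices of λ and μ are illustrative conventions, not taken from the paper:

```python
import numpy as np

def svt(X, tau):
    """Singular value thresholding: proximal operator of tau * nuclear norm."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def soft(X, tau):
    """Entrywise soft-thresholding: proximal operator of tau * l1 norm."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def rpca_admm(D, lam=None, mu=None, n_iter=500, tol=1e-7):
    """ADMM for min ||L||_* + lam * ||S||_1  subject to  D = L + S.

    Y is the scaled dual variable for the constraint; each sweep solves the
    L- and S-subproblems in closed form, then takes a dual ascent step.
    """
    m, n = D.shape
    if lam is None:                       # standard RPCA weighting
        lam = 1.0 / np.sqrt(max(m, n))
    if mu is None:                        # common heuristic penalty parameter
        mu = 0.25 * m * n / (np.abs(D).sum() + 1e-12)
    L = np.zeros_like(D)
    S = np.zeros_like(D)
    Y = np.zeros_like(D)
    for _ in range(n_iter):
        L = svt(D - S + Y / mu, 1.0 / mu)     # low-rank update
        S = soft(D - L + Y / mu, lam / mu)    # sparse-error update
        R = D - L - S                         # primal residual
        Y += mu * R
        if np.linalg.norm(R) <= tol * np.linalg.norm(D):
            break
    return L, S
```

In the tensor setting the nuclear-norm step is applied to mode-wise unfoldings (with the deformation parameters updated in an outer linearized loop), but the thresholding structure per iteration is the same.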
Pages: 238-255
Page count: 18