Robust Low-Rank Tensor Recovery with Rectification and Alignment

Cited by: 148
Authors
Zhang, Xiaoqin [1 ]
Wang, Di [1 ]
Zhou, Zhengyuan [2 ]
Ma, Yi [3 ]
Affiliations
[1] Wenzhou Univ, Coll Comp Sci & Artificial Intelligence, Wenzhou 325035, Zhejiang, Peoples R China
[2] Stanford Univ, Dept Elect Engn, Stanford, CA 94305 USA
[3] Univ Calif Berkeley, Dept Elect Engn & Comp Sci, Berkeley, CA 94720 USA
Funding
National Natural Science Foundation of China;
Keywords
Low-rank tensor recovery; rectification; alignment; ADMM; proximal gradient; matrix completion; models; factorization; algorithm;
DOI
10.1109/TPAMI.2019.2929043
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
Low-rank tensor recovery in the presence of sparse but arbitrary errors is an important problem with many practical applications. In this work, we propose a general framework that recovers low-rank tensors, in which the data can be deformed by some unknown transformations and corrupted by arbitrary sparse errors. We give a unified presentation of the surrogate-based formulations that incorporate the features of rectification and alignment simultaneously, and establish worst-case error bounds of the recovered tensor. In this context, the state-of-the-art methods 'RASL' and 'TILT' can be viewed as two special cases of our work, and yet each only performs part of the function of our method. Subsequently, we study the optimization aspects of the problem in detail by deriving two algorithms, one based on the alternating direction method of multipliers (ADMM) and the other based on proximal gradient. We provide convergence guarantees for the latter algorithm, and demonstrate the performance of the former through in-depth simulations. Finally, we present extensive experimental results on public datasets to demonstrate the effectiveness and efficiency of the proposed framework and algorithms.
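As a concrete illustration of the ADMM-style updates the abstract alludes to, the sketch below solves the simpler matrix special case, robust PCA: decomposing an observation D into a low-rank part L plus a sparse error S via singular value thresholding and entrywise soft thresholding. It is not the authors' tensor formulation with rectification and alignment; the function names and the default choices of lam and mu are illustrative assumptions.

# Illustrative sketch only (not the paper's algorithm): ADMM for robust PCA,
# the matrix special case of the low-rank-plus-sparse recovery problem.
import numpy as np

def soft_threshold(X, tau):
    # Entrywise soft-thresholding: proximal operator of tau * ||.||_1.
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def svd_threshold(X, tau):
    # Singular value thresholding: proximal operator of tau * nuclear norm.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def rpca_admm(D, lam=None, mu=None, n_iter=200, tol=1e-7):
    # min_{L,S} ||L||_* + lam * ||S||_1  subject to  L + S = D
    m, n = D.shape
    if lam is None:
        lam = 1.0 / np.sqrt(max(m, n))                 # standard RPCA weight
    if mu is None:
        mu = 0.25 * m * n / (np.abs(D).sum() + 1e-12)  # common step-size heuristic
    L = np.zeros_like(D)
    S = np.zeros_like(D)
    Y = np.zeros_like(D)                               # dual variable
    for _ in range(n_iter):
        L = svd_threshold(D - S + Y / mu, 1.0 / mu)    # low-rank update
        S = soft_threshold(D - L + Y / mu, lam / mu)   # sparse-error update
        residual = D - L - S
        Y = Y + mu * residual                          # dual ascent
        if np.linalg.norm(residual) <= tol * np.linalg.norm(D):
            break
    return L, S

# Usage: recover a rank-2 matrix corrupted by 5% large sparse errors.
rng = np.random.default_rng(0)
L_true = rng.standard_normal((80, 2)) @ rng.standard_normal((60, 2)).T
S_true = np.zeros_like(L_true)
mask = rng.random(L_true.shape) < 0.05
S_true[mask] = 5.0 * rng.standard_normal(mask.sum())
L_hat, S_hat = rpca_admm(L_true + S_true)
print("relative error:", np.linalg.norm(L_hat - L_true) / np.linalg.norm(L_true))

In the full framework described in the abstract, analogous proximal steps would presumably act on tensor unfoldings, with the unknown rectification and alignment transformations updated within the same iterative loop.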
Pages: 238-255
Page count: 18
Related papers
50 records in total
  • [41] Accelerating Ill-Conditioned Robust Low-Rank Tensor Regression
    Tong, Tian
    Ma, Cong
    Chi, Yuejie
    2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2022, : 9072 - 9076
  • [42] Robust Tensor SVD and Recovery With Rank Estimation
    Shi, Qiquan
    Cheung, Yiu-Ming
    Lou, Jian
    IEEE TRANSACTIONS ON CYBERNETICS, 2022, 52 (10) : 10667 - 10682
  • [43] Tensor Robust Principal Component Analysis with Low-Rank Weight Constraints for Sample Clustering
    Zhao, Yu-Ying
    Wang, Mao-Li
    Wang, Juan
    Yuan, Sha-Sha
    Liu, Jin-Xing
    Kong, Xiang-Zhen
    2020 IEEE INTERNATIONAL CONFERENCE ON BIOINFORMATICS AND BIOMEDICINE, 2020, : 397 - 401
  • [44] Low-Rank Tensor Completion for Image and Video Recovery via Capped Nuclear Norm
    Chen, Xi
    Li, Jie
    Song, Yun
    Li, Feng
    Chen, Jianjun
    Yang, Kun
    IEEE ACCESS, 2019, 7 : 112142 - 112153
  • [45] Low-Rank Matrix Recovery Via Robust Outlier Estimation
    Guo, Xiaojie
    Lin, Zhouchen
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2018, 27 (11) : 5316 - 5327
  • [46] t-Schatten-p Norm for Low-Rank Tensor Recovery
    Kong, Hao
    Xie, Xingyu
    Lin, Zhouchen
    IEEE JOURNAL OF SELECTED TOPICS IN SIGNAL PROCESSING, 2018, 12 (06) : 1405 - 1419
  • [47] Alternating Direction Method of Multipliers for Generalized Low-Rank Tensor Recovery
    Shi, Jiarong
    Yin, Qingyan
    Zheng, Xiuyun
    Yang, Wei
    ALGORITHMS, 2016, 9 (02)
  • [48] Robust Low-Tubal-Rank Tensor Recovery From Binary Measurements
    Hou, Jingyao
    Zhang, Feng
    Qiu, Haiquan
    Wang, Jianjun
    Wang, Yao
    Meng, Deyu
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2022, 44 (08) : 4355 - 4373
  • [49] Sparse and Low-Rank Tensor Decomposition
    Shah, Parikshit
    Rao, Nikhil
    Tang, Gongguo
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 28 (NIPS 2015), 2015, 28
  • [50] Multiresolution Low-Rank Tensor Formats
    Mickelin, Oscar
    Karaman, Sertac
    SIAM JOURNAL ON MATRIX ANALYSIS AND APPLICATIONS, 2020, 41 (03) : 1086 - 1114