Robust Low-Rank Tensor Recovery with Rectification and Alignment

Cited by: 155
Authors
Zhang, Xiaoqin [1 ]
Wang, Di [1 ]
Zhou, Zhengyuan [2 ]
Ma, Yi [3 ]
Affiliations
[1] Wenzhou Univ, Coll Comp Sci & Artificial Intelligence, Wenzhou 325035, Zhejiang, Peoples R China
[2] Stanford Univ, Dept Elect Engn, Stanford, CA 94305 USA
[3] Univ Calif Berkeley, Dept Elect Engn & Comp Sci, Berkeley, CA 94720 USA
Funding
National Natural Science Foundation of China;
Keywords
Low-rank tensor recovery; rectification; alignment; ADMM; proximal gradient; MATRIX COMPLETION; MODELS; FACTORIZATION; ALGORITHM;
DOI
10.1109/TPAMI.2019.2929043
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Low-rank tensor recovery in the presence of sparse but arbitrary errors is an important problem with many practical applications. In this work, we propose a general framework that recovers low-rank tensors, in which the data can be deformed by some unknown transformations and corrupted by arbitrary sparse errors. We give a unified presentation of the surrogate-based formulations that incorporate the features of rectification and alignment simultaneously, and establish worst-case error bounds of the recovered tensor. In this context, the state-of-the-art methods 'RASL' and 'TILT' can be viewed as two special cases of our work, and yet each only performs part of the function of our method. Subsequently, we study the optimization aspects of the problem in detail by deriving two algorithms, one based on the alternating direction method of multipliers (ADMM) and the other based on proximal gradient. We provide convergence guarantees for the latter algorithm, and demonstrate the performance of the former through in-depth simulations. Finally, we present extensive experimental results on public datasets to demonstrate the effectiveness and efficiency of the proposed framework and algorithms.
Pages: 238-255
Number of pages: 18