Tensor Robust Principal Component Analysis: Exact Recovery of Corrupted Low-Rank Tensors via Convex Optimization

Cited by: 436
Authors
Lu, Canyi [1 ]
Feng, Jiashi [1 ]
Chen, Yudong [2 ]
Liu, Wei [3 ]
Lin, Zhouchen [4 ,5 ]
Yan, Shuicheng [1 ]
Affiliations
[1] Natl Univ Singapore, Dept Elect & Comp Engn, Singapore, Singapore
[2] Cornell Univ, Sch Operat Res & Informat Engn, Ithaca, NY 14853 USA
[3] Didi Res, Beijing, Peoples R China
[4] Peking Univ, Sch EECS, Key Lab Machine Percept MOE, Beijing, Peoples R China
[5] Shanghai Jiao Tong Univ, Cooperat Medianet Innovat Ctr, Shanghai, Peoples R China
Source
2016 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR) | 2016
Funding
National Research Foundation of Singapore;
Keywords
FACTORIZATION; FRAMEWORK; MODELS;
DOI
10.1109/CVPR.2016.567
Chinese Library Classification (CLC) code
TP18 [Theory of Artificial Intelligence];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This paper studies the Tensor Robust Principal Component Analysis (TRPCA) problem, which extends the known Robust PCA [4] to the tensor case. Our model is based on a new tensor Singular Value Decomposition (t-SVD) [14] and its induced tensor tubal rank and tensor nuclear norm. Consider a 3-way tensor X ∈ R^(n1×n2×n3) such that X = L0 + S0, where L0 has low tubal rank and S0 is sparse. Is it possible to recover both components? In this work, we prove that under certain suitable assumptions, we can recover both the low-rank and the sparse components exactly by simply solving a convex program whose objective is a weighted combination of the tensor nuclear norm and the ℓ1-norm, i.e., min_{L,E} ‖L‖_* + λ‖E‖_1, s.t. X = L + E, where λ = 1/√(max(n1, n2) · n3). Interestingly, TRPCA includes RPCA as the special case n3 = 1, and thus it is a simple and elegant tensor extension of RPCA. Numerical experiments verify our theory, and an application to image denoising demonstrates the effectiveness of our method.
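As a concrete illustration of the objective in the abstract, here is a minimal NumPy sketch (not the authors' released code) of the t-SVD-induced tensor nuclear norm — the average of the matrix nuclear norms of the frontal slices after a DFT along the third dimension — together with the recommended weight λ. The function names are illustrative.

```python
import numpy as np

def tensor_nuclear_norm(X):
    """t-SVD tensor nuclear norm of a 3-way array X (n1 x n2 x n3):
    DFT along the tubes (axis 2), then average the matrix nuclear
    norms of the frontal slices in the Fourier domain."""
    n3 = X.shape[2]
    Xf = np.fft.fft(X, axis=2)
    return sum(np.linalg.norm(Xf[:, :, k], ord='nuc') for k in range(n3)) / n3

def trpca_lambda(n1, n2, n3):
    """Recommended regularization weight: 1 / sqrt(max(n1, n2) * n3)."""
    return 1.0 / np.sqrt(max(n1, n2) * n3)
```

When n3 = 1 the DFT is the identity, so `tensor_nuclear_norm` reduces to the matrix nuclear norm and `trpca_lambda` to the classical RPCA weight 1/√max(n1, n2), matching the special-case claim in the abstract.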
Pages: 5249-5257
Page count: 9
Related Papers
28 in total
[1] [Anonymous], 2002, Principal components analysis
[2] [Anonymous], 2015, arXiv:1404.3840
[3] [Anonymous], 2015, arXiv:1502.04689
[4] [Anonymous], 2010, arXiv:1010.0789
[5] Boyd S., Parikh N., Chu E., Peleato B., Eckstein J. Distributed optimization and statistical learning via the alternating direction method of multipliers [J]. Foundations and Trends in Machine Learning, 2010, 3(1): 1-122
[6] Braman, Karen. Third-order tensors as linear operators on a space of matrices [J]. Linear Algebra and Its Applications, 2010, 433(7): 1241-1253
[7] Candes, Emmanuel J., Li, Xiaodong, Ma, Yi, Wright, John. Robust Principal Component Analysis? [J]. Journal of the ACM, 2011, 58(3)
[8] Candes, Emmanuel J., Recht, Benjamin. Exact Matrix Completion via Convex Optimization [J]. Foundations of Computational Mathematics, 2009, 9(6): 717-772
[9] Chen, Yudong. Incoherence-Optimal Matrix Completion [J]. IEEE Transactions on Information Theory, 2015, 61(5): 2909-2923
[10] Gandy, Silvia, Recht, Benjamin, Yamada, Isao. Tensor completion and low-n-rank tensor recovery via convex optimization [J]. Inverse Problems, 2011, 27(2)