Robust Kronecker Component Analysis

Cited by: 20
Authors
Bahri, Mehdi [1]
Panagakis, Yannis [1,2]
Zafeiriou, Stefanos [1,3]
Affiliations
[1] Imperial Coll London, Dept Comp, London SW7 2RH, England
[2] Middlesex Univ, London NW4 4BT, England
[3] Univ Oulu, Oulu 90014, Finland
Funding
EU Horizon 2020; Engineering and Physical Sciences Research Council (EPSRC), UK;
Keywords
Component analysis; dictionary learning; separable dictionaries; low-rank; sparsity; global optimality; SPARSE REPRESENTATION; ALGORITHM; RECOVERY; COMPLEX; MODELS;
DOI
10.1109/TPAMI.2018.2881476
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Dictionary learning and component analysis models are fundamental for learning compact representations that are relevant to a given task (feature extraction, dimensionality reduction, denoising, etc.). The model complexity is encoded by means of specific structure, such as sparsity, low-rankness, or nonnegativity. Unfortunately, approaches like K-SVD, which learn dictionaries for sparse coding via the Singular Value Decomposition (SVD), are hard to scale to high-volume and high-dimensional visual data, and are fragile in the presence of outliers. Conversely, robust component analysis methods such as Robust Principal Component Analysis (RPCA) are able to recover low-complexity (e.g., low-rank) representations from data corrupted with noise of unknown magnitude and support, but do not provide a dictionary that respects the structure of the data (e.g., images), and also involve expensive computations. In this paper, we propose a novel Kronecker-decomposable component analysis model, coined Robust Kronecker Component Analysis (RKCA), that combines ideas from sparse dictionary learning and robust component analysis. RKCA has several appealing properties: it is robust to gross corruption, it can be used for low-rank modeling, and it leverages separability to solve significantly smaller problems. We design an efficient learning algorithm by drawing links with a restricted form of tensor factorization, and analyze its optimality and low-rankness properties. The effectiveness of the proposed approach is demonstrated on real-world applications, namely background subtraction and image denoising and completion, by performing a thorough comparison with the current state of the art.
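The separability mentioned in the abstract rests on the standard Kronecker identity vec(A X B^T) = (B ⊗ A) vec(X): a Kronecker-decomposable dictionary is stored and learned as two small factor matrices rather than one large matrix. The sketch below is a minimal NumPy illustration of that identity and of the resulting parameter savings; it is not the authors' RKCA algorithm, and all variable names and sizes are chosen for illustration only.

# Minimal sketch (assumed, illustrative sizes): why a Kronecker-separable
# dictionary yields much smaller subproblems than a full dictionary.
import numpy as np

rng = np.random.default_rng(0)
m, n, p, q = 8, 8, 4, 4          # signal is an m x n patch, code is p x q
A = rng.standard_normal((m, p))  # left (column-space) factor dictionary
B = rng.standard_normal((n, q))  # right (row-space) factor dictionary
X = rng.standard_normal((p, q))  # coefficient matrix (sparse in practice)

# vec(A X B^T) == (B kron A) vec(X), with column-major vectorization.
left = (A @ X @ B.T).reshape(-1, order="F")
right = np.kron(B, A) @ X.reshape(-1, order="F")
assert np.allclose(left, right)

# Two small factors vs. the (m*n) x (p*q) dictionary they implicitly define.
print("separable parameters:", A.size + B.size)   # m*p + n*q = 64
print("full dictionary size:", m * n * p * q)     # 1024

Because the full Kronecker dictionary is never formed explicitly, updates can be carried out on the small factors A and B, which is the sense in which separability reduces the size of the problems being solved.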
Pages: 2365-2379
Page count: 15
Related Papers
68 records in total
[1] Aharon, Michal; Elad, Michael; Bruckstein, Alfred. K-SVD: An algorithm for designing overcomplete dictionaries for sparse representation [J]. IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2006, 54(11): 4311-4322.
[2] Anandkumar, A. JMLR Workshop and Conference Proceedings, 2016, 51: 268.
[3] [Anonymous]. Matrix Analysis for Scientists and Engineers. 2004.
[4] Bahri, Mehdi; Panagakis, Yannis; Zafeiriou, Stefanos. Robust Kronecker-Decomposable Component Analysis for Low-Rank Modeling [J]. 2017 IEEE INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV), 2017: 3372-3381.
[5] Basri, R; Jacobs, DW. Lambertian reflectance and linear subspaces [J]. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2003, 25(02): 218-233.
[6] Boyd, S.; Parikh, N.; Chu, E.; Peleato, B.; Eckstein, J. Distributed optimization and statistical learning via the alternating direction method of multipliers [J]. Foundations and Trends in Machine Learning, 2010, 3(01): 1-122.
[7] Candes, Emmanuel J.; Li, Xiaodong; Ma, Yi; Wright, John. Robust Principal Component Analysis? [J]. JOURNAL OF THE ACM, 2011, 58(03).
[8] Carroll, JD; Chang, JJ. Analysis of individual differences in multidimensional scaling via an N-way generalization of Eckart-Young decomposition [J]. PSYCHOMETRIKA, 1970, 35(03): 283-.
[9] Chan, TF. An improved algorithm for computing the singular value decomposition [J]. ACM TRANSACTIONS ON MATHEMATICAL SOFTWARE, 1982, 8(01): 72-83.
[10] Chen, Xiai; Han, Zhi; Wang, Yao; Zhao, Qian; Meng, Deyu; Tang, Yandong. Robust Tensor Factorization with Unknown Noise [J]. 2016 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2016: 5213-5221.