A Generalized Model for Robust Tensor Factorization With Noise Modeling by Mixture of Gaussians

Cited by: 54
Authors
Chen, Xi'ai [1 ,2 ]
Han, Zhi [1 ]
Wang, Yao [3 ]
Zhao, Qian [3 ]
Meng, Deyu [3 ]
Lin, Lin [3 ]
Tang, Yandong [1 ]
Affiliations
[1] Chinese Acad Sci, Shenyang Inst Automat, State Key Lab Robot, Shenyang 110016, Liaoning, Peoples R China
[2] Univ Chinese Acad Sci, Beijing 100049, Peoples R China
[3] Xi An Jiao Tong Univ, Sch Math & Stat, Xian 710049, Shaanxi, Peoples R China
Funding
National Natural Science Foundation of China; China Postdoctoral Science Foundation;
Keywords
Expectation-maximization (EM) algorithm; generalized weighted low-rank tensor factorization (GWLRTF); mixture of Gaussians (MoG) model; tensor factorization; DIMENSIONALITY REDUCTION; IMAGES; RANK; RECOGNITION; MOTION;
DOI
10.1109/TNNLS.2018.2796606
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The low-rank tensor factorization (LRTF) technique has received increasing attention in many computer vision applications. Compared with the traditional matrix factorization technique, it better preserves the intrinsic structural information of the data and thus achieves better low-dimensional subspace recovery. Basically, the desired low-rank tensor is recovered by minimizing the least-squares loss between the input data and its factorized representation. Since the least-squares loss is optimal only when the noise follows a Gaussian distribution, L1-norm-based methods have been designed to deal with outliers. Unfortunately, they may lose their effectiveness on real data, which are often contaminated by complex noise. In this paper, we consider integrating the noise-modeling technique into a generalized weighted LRTF (GWLRTF) procedure. The resulting method, called MoG GWLRTF, treats the original problem as an LRTF problem and models the noise using a mixture of Gaussians (MoG). To extend the applicability of the model, two typical tensor factorization operations, i.e., CANDECOMP/PARAFAC (CP) factorization and Tucker factorization, are incorporated into the LRTF procedure. The model parameters are updated under the expectation-maximization (EM) framework. Extensive experiments indicate the respective advantages of the two versions of MoG GWLRTF in various applications and demonstrate their effectiveness compared with other competing methods.
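As a rough illustration of the idea described in the abstract, the sketch below combines a CP factorization with an MoG noise model fitted by EM: the E-step assigns each tensor entry a responsibility under each Gaussian component, and the M-step re-estimates the noise parameters and then refits the CP factors by responsibility-weighted alternating least squares. This is a minimal NumPy sketch under assumptions of this illustration (function names, initialization, and update order are not the authors' implementation).

```python
import numpy as np

def khatri_rao(U, V):
    """Column-wise Kronecker (Khatri-Rao) product of two factor matrices."""
    r = U.shape[1]
    return np.einsum('ir,jr->ijr', U, V).reshape(-1, r)

def weighted_ls_rows(Xmat, Wmat, D):
    """Row-wise weighted least squares:
    for each row i, minimize sum_n Wmat[i, n] * (Xmat[i, n] - D[n] @ a)^2."""
    R = D.shape[1]
    out = np.empty((Xmat.shape[0], R))
    for i in range(Xmat.shape[0]):
        w = Wmat[i]
        G = D.T @ (w[:, None] * D) + 1e-8 * np.eye(R)  # small ridge for stability
        out[i] = np.linalg.solve(G, D.T @ (w * Xmat[i]))
    return out

def mog_cp(X, rank=2, K=2, n_iters=40, seed=0):
    """Hypothetical MoG-weighted CP factorization trained with EM.
    X: 3-way numpy array; returns factors A, B, C with
    X ~= einsum('ir,jr,kr->ijk', A, B, C)."""
    rng = np.random.default_rng(seed)
    I, J, L = X.shape
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((L, rank))
    pi = np.full(K, 1.0 / K)                      # MoG mixing weights
    var = np.var(X) * (np.arange(1, K + 1) / K)   # MoG component variances

    for _ in range(n_iters):
        E = X - np.einsum('ir,jr,kr->ijk', A, B, C)   # residual tensor
        # E-step: per-entry responsibilities (log-domain for stability)
        log_r = np.stack([np.log(pi[k]) - 0.5 * np.log(var[k])
                          - E**2 / (2 * var[k]) for k in range(K)])
        log_r -= log_r.max(axis=0, keepdims=True)
        resp = np.exp(log_r)
        resp /= resp.sum(axis=0, keepdims=True)
        # M-step, noise model: mixing weights and component variances
        Nk = resp.reshape(K, -1).sum(axis=1) + 1e-12
        pi = np.clip(Nk / Nk.sum(), 1e-12, None)
        var = np.maximum((resp * E**2).reshape(K, -1).sum(axis=1) / Nk, 1e-8)
        # Per-entry precision weight: sum_k resp_k / var_k
        W = np.tensordot(1.0 / var, resp, axes=(0, 0))
        # M-step, factors: weighted ALS, one mode at a time
        A = weighted_ls_rows(X.reshape(I, -1), W.reshape(I, -1),
                             khatri_rao(B, C))
        B = weighted_ls_rows(X.transpose(1, 0, 2).reshape(J, -1),
                             W.transpose(1, 0, 2).reshape(J, -1),
                             khatri_rao(A, C))
        C = weighted_ls_rows(X.transpose(2, 0, 1).reshape(L, -1),
                             W.transpose(2, 0, 1).reshape(L, -1),
                             khatri_rao(A, B))
    return A, B, C
```

Down-weighting each entry by its estimated noise precision is what lets the fit tolerate outliers: entries assigned to the large-variance Gaussian contribute little to the factor updates. The Tucker variant mentioned in the abstract would replace the CP factor updates with core-and-factor updates; it is omitted here for brevity.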
Pages: 5380-5393
Page count: 14