Robust Multiview Subspace Learning With Nonindependently and Nonidentically Distributed Complex Noise

Cited: 23
Authors
Yue, Zongsheng [1 ]
Yong, Hongwei [2 ]
Meng, Deyu [1 ]
Zhao, Qian [1 ]
Leung, Yee [3 ]
Zhang, Lei [2 ]
Affiliations
[1] Xi An Jiao Tong Univ, Key Lab Intelligent Networks & Network Secur, Minist Educ, Inst Informat & Syst Sci, Xian 710049, Peoples R China
[2] Hong Kong Polytech Univ, Dept Comp, Hong Kong, Peoples R China
[3] Chinese Univ Hong Kong, Inst Future Cities, Dept Geog & Resource Management, Hong Kong, Peoples R China
Keywords
Data models; Laplace equations; Adaptation models; Distributed databases; Feature extraction; Correlation; Robustness; Dirichlet process (DP) mixture model; hierarchical Dirichlet process (HDP); multiview; subspace learning; variational Bayes; MATRIX FACTORIZATION; REGRESSION;
DOI
10.1109/TNNLS.2019.2917328
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Multiview subspace learning (MSL), which aims to obtain a low-dimensional latent subspace from multiview data, has been widely used in practical applications. Most recent MSL approaches, however, assume only a simple independent and identically distributed (i.i.d.) Gaussian or Laplacian noise across all views, which largely underestimates the noise complexity of practical multiview data. In real cases, the noise across different views generally exhibits three characteristics. First, within each view, the data noise has a complex configuration beyond a simple Gaussian or Laplacian distribution. Second, the noise distributions of different views are generally nonidentical, with evident distinctions among them. Third, the noises across views are not independent but clearly correlated. Based on these understandings, we construct a new MSL model that more faithfully and comprehensively accounts for all of these noise characteristics. First, the noise in each view is modeled as a Dirichlet process (DP) Gaussian mixture model (DPGMM), which can fit a wider range of complex noise types than a conventional Gaussian or Laplacian distribution. Second, the DPGMM parameters differ from view to view, which encodes the "nonidentical" noise property. Third, the DPGMMs on all views share the same high-level priors through a hierarchical DP, which encodes the "nonindependent" noise property. All of these ideas are incorporated into an integrated graphical model that can be solved by the variational Bayes algorithm. The superiority of the proposed method over current state-of-the-art MSL methods is verified by experiments on 3-D reconstruction simulations, multiview face modeling, and background subtraction.
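The per-view DPGMM noise model described above can be illustrated with a truncated DP Gaussian mixture fit by variational Bayes, here via scikit-learn's `BayesianGaussianMixture`. This is only a hedged stand-in for one view, not the authors' implementation: the hierarchical-DP coupling that shares priors across views is not captured, and the simulated noise is an assumed example.

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# Simulated complex noise in a single view: dense small perturbations
# mixed with sparse large outliers (beyond a single Gaussian/Laplacian).
noise = np.concatenate([
    rng.normal(0.0, 0.1, size=900),
    rng.normal(0.0, 2.0, size=100),
]).reshape(-1, 1)

# Truncated DP Gaussian mixture solved by variational inference;
# superfluous components are automatically shrunk toward zero weight.
dpgmm = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior_type="dirichlet_process",
    max_iter=500,
    random_state=0,
).fit(noise)

# Count components that retain non-negligible mixing weight.
effective = int(np.sum(dpgmm.weights_ > 0.05))
print(effective)  # typically only a few dominant components survive
```

In the paper's full model, one such mixture is learned per view with view-specific parameters (the "nonidentical" property), while a hierarchical DP ties the views' mixtures to shared top-level priors (the "nonindependent" property).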
Pages: 1070-1083
Page count: 14