Robust L1 principal component analysis and its Bayesian variational inference

Cited by: 78
Author:
Gao, Junbin [1]
Affiliation:
[1] Charles Sturt Univ, Sch Comp Sci, Bathurst, NSW 2795, Australia
DOI: 10.1162/neco.2007.11-06-397
CLC Classification: TP18 [Artificial Intelligence Theory]
Discipline Codes: 081104; 0812; 0835; 1405
Abstract:
We introduce a robust probabilistic L1-PCA model in which the conventional gaussian distribution for the noise in the observed data is replaced by the Laplacian (or L1) distribution. Owing to the heavy tails of the L1 distribution, the proposed model is expected to be more robust against outliers in the data. In this letter, we demonstrate how a variational approximation scheme enables effective inference of the key parameters in the probabilistic L1-PCA model. Because the L1 density can be expanded as a superposition of an infinite number of gaussian densities, we express the L1-PCA model as a model marginalized over these superpositions. In this way, tractable Bayesian inference can be achieved via a variational expectation-maximization-type algorithm.
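The gaussian scale-mixture expansion mentioned in the abstract can be checked numerically: integrating a zero-mean gaussian density over an exponentially distributed variance recovers a Laplace density. The sketch below is illustrative and not taken from the paper; the particular parameterization (unit-rate exponential mixing, which yields a Laplace scale of 1/√2) is an assumption chosen for the demonstration.

```python
import numpy as np

def laplace_pdf(x, b):
    """Laplace (L1) density with scale b: exp(-|x|/b) / (2b)."""
    return np.exp(-np.abs(x) / b) / (2.0 * b)

def gaussian_mixture_pdf(x, n=200_000, umax=10.0):
    """Evaluate p(x) = integral_0^inf N(x | 0, w) * Exp(w | rate=1) dw.

    Substituting w = u^2 removes the 1/sqrt(w) singularity at w = 0;
    the integrand becomes sqrt(2/pi) * exp(-x^2 / (2 u^2) - u^2).
    """
    u = np.linspace(1e-9, umax, n)
    f = np.sqrt(2.0 / np.pi) * np.exp(-x**2 / (2.0 * u**2) - u**2)
    du = u[1] - u[0]
    return (f.sum() - 0.5 * (f[0] + f[-1])) * du  # trapezoid rule

# With a unit-rate exponential prior on the gaussian variance,
# the marginal density is Laplace with scale b = 1/sqrt(2).
b = 1.0 / np.sqrt(2.0)
for x in (0.0, 0.5, 1.0, 2.5):
    print(x, gaussian_mixture_pdf(x), laplace_pdf(x, b))
```

The two columns agree closely at every test point, which is the identity that lets the paper marginalize over the superposition of gaussians and run a tractable variational EM.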
Pages: 555-572 (18 pages)