Inference algorithms and learning theory for Bayesian sparse factor analysis

Cited by: 11
Authors
Rattray, Magnus [1 ]
Stegle, Oliver [2 ,3 ]
Sharp, Kevin [1 ]
Winn, John [4 ]
Affiliations
[1] Univ Manchester, Sch Comp Sci, Manchester M13 9PL, Lancs, England
[2] Max Planck Inst Biol Cybernet, Tubingen, Germany
[3] Max Planck Inst Dev Biol, Tubingen, Germany
[4] Microsoft Res Cambridge, Cambridge CB3 0FB, England
Source
INTERNATIONAL WORKSHOP ON STATISTICAL-MECHANICAL INFORMATICS 2009 (IW-SMI 2009) | 2009, Vol. 197
DOI
10.1088/1742-6596/197/1/012002
CLC classification: R318 [Biomedical Engineering]
Discipline code: 0831
Abstract
Bayesian sparse factor analysis has many applications; for example, it has been applied to the problem of inferring a sparse regulatory network from gene expression data. We describe a number of inference algorithms for Bayesian sparse factor analysis using a slab and spike mixture prior. These include well-established Markov chain Monte Carlo (MCMC) and variational Bayes (VB) algorithms as well as a novel hybrid of VB and Expectation Propagation (EP). For the case of a single latent factor we derive a theory for learning performance using the replica method. We compare the MCMC and VB/EP algorithm results on simulated data with the theoretical predictions. The results for MCMC agree closely with the theory, as expected. Results for VB/EP are slightly sub-optimal but show that the new algorithm is effective for sparse inference. In large-scale problems MCMC is infeasible due to computational limitations, and the VB/EP algorithm then provides a very useful, computationally efficient alternative.
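To make the model concrete, a Gibbs sampler for a single-latent-factor version of this model (data Y ≈ w xᵀ + noise, with each weight w_i = s_i b_i under a slab and spike prior) can be sketched as below. This is an illustrative sketch, not the authors' implementation; the noise variance `sigma2`, slab variance `sigma2_w`, and inclusion probability `pi` are assumed known here, and the indicator update integrates out the slab weight b_i in closed form.

```python
import numpy as np

def gibbs_sparse_factor(Y, n_iter=200, pi=0.2, sigma2=0.25, sigma2_w=1.0, seed=0):
    """Gibbs sampler for a single-factor slab-and-spike model Y ~ w x^T + noise.

    Y: (N, T) data matrix (e.g. N genes, T samples).
    Returns the posterior inclusion probability of each weight, estimated
    by averaging the indicators s over the post-burn-in Gibbs samples.
    """
    rng = np.random.default_rng(seed)
    N, T = Y.shape
    x = rng.standard_normal(T)                       # latent factor, N(0, 1) prior
    s = rng.random(N) < pi                           # spike/slab indicators
    b = rng.standard_normal(N) * np.sqrt(sigma2_w)   # slab weights
    s_samples = []
    for it in range(n_iter):
        # --- update (s_i, b_i) for each row, marginalising b_i to sample s_i ---
        lam = x @ x / sigma2 + 1.0 / sigma2_w        # posterior precision of b_i (slab)
        for i in range(N):
            mu = (x @ Y[i] / sigma2) / lam           # posterior mean of b_i (slab)
            # log Bayes factor, slab vs spike, with b_i integrated out
            log_odds = (np.log(pi) - np.log1p(-pi)
                        + 0.5 * np.log(1.0 / (sigma2_w * lam))
                        + 0.5 * lam * mu ** 2)
            s[i] = rng.random() < 1.0 / (1.0 + np.exp(-log_odds))
            if s[i]:
                b[i] = mu + rng.standard_normal() / np.sqrt(lam)
            else:
                b[i] = rng.standard_normal() * np.sqrt(sigma2_w)  # draw from prior
        w = s * b
        # --- update the latent factor x given the weights w ---
        prec = w @ w / sigma2 + 1.0
        mean = (w @ Y) / sigma2 / prec
        x = mean + rng.standard_normal(T) / np.sqrt(prec)
        if it >= n_iter // 2:                        # keep post-burn-in samples only
            s_samples.append(s.copy())
    return np.mean(s_samples, axis=0)
```

On synthetic data with a few strongly loaded rows, the averaged indicators concentrate near 1 for the active weights and near 0 for the rest (up to the usual sign ambiguity in w and x, which does not affect the indicators).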
Pages: 10