A review of deterministic approximate inference techniques for Bayesian machine learning

Cited by: 32
Authors
Sun, Shiliang [1 ]
Affiliations
[1] E China Normal Univ, Dept Comp Sci & Technol, Shanghai 200241, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Uncertainty; Probabilistic models; Bayesian machine learning; Posterior distribution; Deterministic approximate inference; VARIATIONAL INFERENCE; MODELS; PROPAGATION;
DOI
10.1007/s00521-013-1445-4
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
A central task of Bayesian machine learning is to infer the posterior distribution of hidden random variables given observations and to calculate expectations with respect to this distribution. However, this is often computationally intractable, so approximation schemes must be sought. Deterministic approximate inference techniques are an alternative to stochastic approximate inference methods based on numerical sampling, namely Monte Carlo techniques, and the last 15 years have seen many advances in this field. This paper reviews typical deterministic approximate inference techniques, some of which are very recent and merit further exploration. To promote research in deterministic approximate inference, we also attempt to identify open problems that may be helpful for future investigations in this field.
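To make the inference task described in the abstract concrete, the following is a minimal sketch in standard notation (observations x, hidden variables z, variational distribution q; the symbols are ours, not taken from the paper). The normalizing integral in Bayes' rule is what typically renders exact inference intractable, and variational inference, one deterministic scheme covered by such reviews, replaces sampling with optimization over a tractable family.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Bayes' rule: the posterior over hidden variables z given observations x.
% The integral in the denominator is what usually makes exact inference intractable.
\[
  p(z \mid x) = \frac{p(x \mid z)\, p(z)}{\int p(x \mid z')\, p(z')\, dz'}
\]
% The quantities of interest are expectations under this posterior:
\[
  \mathbb{E}_{p(z \mid x)}\!\left[ f(z) \right] = \int f(z)\, p(z \mid x)\, dz
\]
% Variational inference (a deterministic approximation) chooses q(z) from a
% tractable family to maximize a lower bound on the log evidence:
\[
  \log p(x) \;\ge\; \mathbb{E}_{q(z)}\!\left[ \log p(x, z) - \log q(z) \right]
\]
\end{document}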
Pages: 2039-2050
Page count: 12