Variational Inference for Dirichlet Process Mixtures

Cited: 919
Authors
Blei, David M. [1]
Jordan, Michael I. [2]
Affiliations
[1] Carnegie Mellon Univ, Sch Comp Sci, Pittsburgh, PA 15213 USA
[2] Univ Calif Berkeley, Dept Comp Sci & Stat, Berkeley, CA 94720 USA
Source
BAYESIAN ANALYSIS | 2006, Vol. 1, No. 1
Keywords
Dirichlet processes; hierarchical models; variational inference; image processing; Bayesian computation
DOI
10.1214/06-BA104
CLC classification number
O1 [Mathematics]
Discipline classification code
0701; 070101
Abstract
Dirichlet process (DP) mixture models are the cornerstone of non-parametric Bayesian statistics, and the development of Markov chain Monte Carlo (MCMC) sampling methods for DP mixtures has enabled the application of non-parametric Bayesian methods to a variety of practical data analysis problems. However, MCMC sampling can be prohibitively slow, and it is important to explore alternatives. One class of alternatives is provided by variational methods, a class of deterministic algorithms that convert inference problems into optimization problems (Opper and Saad 2001; Wainwright and Jordan 2003). Thus far, variational methods have mainly been explored in the parametric setting, in particular within the formalism of the exponential family (Attias 2000; Ghahramani and Beal 2001; Blei et al. 2003). In this paper, we present a variational inference algorithm for DP mixtures. We present experiments that compare the algorithm to Gibbs sampling algorithms for DP mixtures of Gaussians and present an application to a large-scale image analysis problem.
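The abstract describes converting posterior inference into optimization. A minimal sketch of the idea, for a 1-D DP mixture of Gaussians with known component variance, is the truncated stick-breaking mean-field coordinate-ascent scheme: approximate q(v_t) by Beta distributions, q(mu_t) by Gaussians, and q(z_n) by categorical responsibilities, then cycle through the closed-form updates. All variable names, the truncation level T, and the simplification to known variance are illustrative assumptions, not the paper's exact algorithm:

```python
import numpy as np
from scipy.special import digamma

def dp_mixture_cavi(x, T=10, alpha=1.0, s2=1.0, s2_0=100.0, n_iter=100, seed=0):
    """Simplified truncated mean-field CAVI for a 1-D DP mixture of Gaussians
    with known component variance s2 (an illustrative sketch, not the paper's
    full algorithm).  x: (N,) data array; T: truncation level; alpha: DP
    concentration; s2_0: prior variance of the component means."""
    rng = np.random.default_rng(seed)
    N = len(x)
    phi = rng.dirichlet(np.ones(T), size=N)   # q(z_n): responsibilities
    for _ in range(n_iter):
        Nt = phi.sum(axis=0)                  # expected component counts
        # q(v_t) = Beta(g1_t, g2_t): stick-breaking proportions
        tail = np.cumsum(Nt[::-1])[::-1]      # tail[t] = sum_{j >= t} Nt_j
        g1 = 1.0 + Nt
        g2 = alpha + np.concatenate([tail[1:], [0.0]])
        # E[log pi_t] = E[log v_t] + sum_{j < t} E[log(1 - v_j)]
        Elogv = digamma(g1) - digamma(g1 + g2)
        Elog1mv = digamma(g2) - digamma(g1 + g2)
        Elogpi = Elogv + np.concatenate([[0.0], np.cumsum(Elog1mv)[:-1]])
        # q(mu_t) = N(m_t, s2_t): conjugate Gaussian update
        prec = 1.0 / s2_0 + Nt / s2
        m = (phi.T @ x / s2) / prec
        s2_t = 1.0 / prec
        # q(z_n): softmax of expected log joint (constants in x dropped)
        logits = Elogpi + (np.outer(x, m) - 0.5 * (m**2 + s2_t)) / s2
        logits -= logits.max(axis=1, keepdims=True)
        phi = np.exp(logits)
        phi /= phi.sum(axis=1, keepdims=True)
    return phi, m
```

Each update maximizes the evidence lower bound in one variational factor with the others held fixed, so the whole loop is deterministic optimization rather than sampling, which is the contrast with Gibbs sampling drawn in the abstract.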
Pages: 121 - 143
Page count: 23
Related papers
50 items total
  • [41] α-VARIATIONAL INFERENCE WITH STATISTICAL GUARANTEES
    Yang, Yun
    Pati, Debdeep
    Bhattacharya, Anirban
    ANNALS OF STATISTICS, 2020, 48 (02) : 886 - 905
  • [42] Structured Variational Inference in Partially Observable Unstable Gaussian Process State Space Models
    Melchior, Silvan
    Curi, Sebastian
    Berkenkamp, Felix
    Krause, Andreas
    LEARNING FOR DYNAMICS AND CONTROL, VOL 120, 2020, 120 : 147 - 157
  • [43] Probabilistic model updating via variational Bayesian inference and adaptive Gaussian process modeling
    Ni, Pinghe
    Li, Jun
    Hao, Hong
    Han, Qiang
    Du, Xiuli
    COMPUTER METHODS IN APPLIED MECHANICS AND ENGINEERING, 2021, 383 (383)
  • [44] Variational Textured Dirichlet Process Mixture Model With Pairwise Constraint for Unsupervised Classification of Polarimetric SAR Images
    Liu, Chi
    Li, Heng-Chao
    Liao, Wenzhi
    Philips, Wilfried
    Emery, William J.
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2019, 28 (08) : 4145 - 4160
  • [45] Structured Optimal Variational Inference for Dynamic Latent Space Models
    Zhao, Peng
    Bhattacharya, Anirban
    Pati, Debdeep
    Mallick, Bani K.
    JOURNAL OF MACHINE LEARNING RESEARCH, 2024, 25 : 1 - 55
  • [46] Stick-Breaking Dependent Beta Processes with Variational Inference
    Cao, Zehui
    Zhao, Jing
    Sun, Shiliang
    NEURAL PROCESSING LETTERS, 2021, 53 (01) : 339 - 353
  • [48] Practical Bayesian inference using mixtures of mixtures
    Cao, GL
    West, M
    BIOMETRICS, 1996, 52 (04) : 1334 - 1341
  • [49] Combined Belief Propagation-Mean Field Message Passing Algorithm for Dirichlet Process Mixtures
    Lu, Xinhua
    Zhang, Chuanzong
    Wang, Zhongyong
    IEEE SIGNAL PROCESSING LETTERS, 2019, 26 (07) : 1041 - 1045
  • [50] Gradient Regularization as Approximate Variational Inference
    Unlu, Ali
    Aitchison, Laurence
    ENTROPY, 2021, 23 (12)