Blocked Gibbs Sampler for Hierarchical Dirichlet Processes

Cited: 0
Authors
Das, Snigdha [1 ]
Niu, Yabo [2 ]
Ni, Yang [1 ]
Mallick, Bani K. [1 ]
Pati, Debdeep [1 ]
Affiliations
[1] Texas A&M Univ, Dept Stat, College Stn, TX 77843 USA
[2] Univ Houston, Dept Math, Houston, TX USA
Keywords
Fast mixing; Normalized random measure; Slice sampling
DOI
10.1080/10618600.2024.2388543
CLC classification
O21 [Probability theory and mathematical statistics]; C8 [Statistics]
Subject classification codes
020208; 070103; 0714
Abstract
Posterior computation in hierarchical Dirichlet process (HDP) mixture models is an active area of research in nonparametric Bayesian inference for grouped data. The existing literature focuses almost exclusively on the Chinese restaurant franchise (CRF) analogy of the marginal distribution of the parameters, which can mix poorly and scales quadratically with the sample size. A recently developed slice sampler allows efficient blocked updates of the parameters, but is shown in this article to be statistically unstable. We develop a blocked Gibbs sampler that employs a truncated approximation of the underlying random measures to sample from the posterior distribution of the HDP; it produces statistically stable results, is highly scalable with respect to sample size, and is shown to mix well. The heart of the construction is to endow the shared concentration parameter with an appropriately chosen gamma prior, which breaks the dependence among the shared mixing proportions and permits independent updates of certain log-concave random variables in a block. En route, we develop an efficient rejection sampler for these random variables leveraging piecewise tangent-line approximations. Supplementary materials, which include substantive additional details and code, are available online.
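The abstract does not spell out the rejection sampler's construction, but piecewise tangent-line envelopes for log-concave densities follow a standard pattern: tangents to the log-density at a few abscissae form a piecewise-linear upper hull, the envelope is the corresponding piecewise-exponential density, and proposals are accepted with probability equal to the target-to-envelope ratio. The sketch below is a generic, hypothetical illustration of this idea using a standard normal target with fixed (non-adaptive) tangent points; it is not the paper's actual sampler.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative log-concave target: standard normal, up to an additive constant.
def h(x):
    return -0.5 * x * x

def h_prime(x):
    return -x

# Fixed tangent abscissae (a hypothetical choice; the paper's sampler is
# specialized to its log-concave full conditionals, not shown here).
xs = np.array([-2.0, -1.0, 1.0, 2.0])
b = h_prime(xs)          # tangent slopes
a = h(xs) - b * xs       # tangent intercepts: l_k(x) = a_k + b_k * x

# Breakpoints where consecutive tangents cross; by concavity the upper
# envelope coincides with tangent k on [lo_k, hi_k].
z = (a[1:] - a[:-1]) / (b[:-1] - b[1:])
lo = np.concatenate(([-np.inf], z))
hi = np.concatenate((z, [np.inf]))

# Mass of the piecewise-exponential envelope exp(a_k + b_k x) per segment
# (all slopes are nonzero here, and the outer slopes keep the tails finite).
mass = (np.exp(a + b * hi) - np.exp(a + b * lo)) / b
prob = mass / mass.sum()

def sample_one():
    """Draw one variate by rejection from the tangent-line envelope."""
    while True:
        k = rng.choice(len(xs), p=prob)
        elo = 0.0 if np.isinf(lo[k]) else np.exp(b[k] * lo[k])
        ehi = 0.0 if np.isinf(hi[k]) else np.exp(b[k] * hi[k])
        # Inverse-CDF draw from the truncated exponential on [lo_k, hi_k].
        x = np.log(elo + rng.random() * (ehi - elo)) / b[k]
        # Accept with probability exp(h(x) - envelope(x)), which is <= 1
        # because each tangent lies above the concave log-density.
        if np.log(rng.random()) <= h(x) - (a[k] + b[k] * x):
            return x

draws = np.array([sample_one() for _ in range(5000)])
print(f"mean={draws.mean():.2f}, sd={draws.std():.2f}")
```

Because the envelope majorizes the target everywhere, every accepted draw is an exact sample; tighter tangent grids raise the acceptance rate at the cost of a slightly more complex envelope.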
Pages: 11