Variance Matrix Priors for Dirichlet Process Mixture Models With Gaussian Kernels

Cited by: 1
Authors
Jing, Wei [1 ]
Papathomas, Michail [1 ]
Liverani, Silvia [2 ,3 ]
Affiliations
[1] Univ St Andrews, Sch Math & Stat, St Andrews, Scotland
[2] Queen Mary Univ London, Sch Math Sci, London, England
[3] Alan Turing Inst, British Lib, London, England
Keywords
Bayesian non-parametrics; clustering; Bayesian variable selection; prior distributions; profile regression; number; lasso
DOI
10.1111/insr.12595
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Subject Classification Codes
020208 ; 070103 ; 0714 ;
Abstract
Bayesian mixture modelling is widely used for density estimation and clustering. The Dirichlet process mixture model (DPMM) is the most popular Bayesian non-parametric mixture modelling approach. In this manuscript, we study the choice of prior for the variance or precision matrix when Gaussian kernels are adopted. Typically, in the relevant literature, the assessment of mixture models is done by considering observations in a space of only a handful of dimensions. Instead, we are concerned with more realistic problems of higher dimensionality, in a space of up to 20 dimensions. We observe that the choice of prior is increasingly important as the dimensionality of the problem increases. After identifying certain undesirable properties of standard priors in problems of higher dimensionality, we review and implement possible alternative priors. The most promising priors are identified, as well as other factors that affect the convergence of MCMC samplers. Our results show that the choice of prior is critical for deriving reliable posterior inferences. This manuscript offers a thorough overview and comparative investigation into possible priors, with detailed guidelines for their implementation. Although our work focuses on the use of the DPMM in clustering, it is also applicable to density estimation.
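The abstract above concerns the choice of prior for the variance (or precision) matrix of the Gaussian kernels in a DPMM. As a minimal, hedged sketch of where that prior enters in practice — not the authors' implementation or MCMC sampler — the following fits a truncated Dirichlet process mixture of Gaussians with scikit-learn's variational `BayesianGaussianMixture`. The `covariance_prior` (an inverse-Wishart scale matrix) and `degrees_of_freedom_prior` arguments are the knobs corresponding to the variance-matrix prior discussed in the paper; the data, truncation level, and prior values below are illustrative choices, not ones taken from the article.

```python
# Sketch: truncated DP mixture of Gaussians, with an explicit
# inverse-Wishart-style prior on the component covariance matrices.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
d = 5  # moderate dimensionality; the paper studies up to 20
# Two well-separated Gaussian clusters (illustrative data).
X = np.vstack([
    rng.normal(loc=-3.0, scale=1.0, size=(200, d)),
    rng.normal(loc=3.0, scale=1.0, size=(200, d)),
])

dpmm = BayesianGaussianMixture(
    n_components=10,                                   # truncation level
    weight_concentration_prior_type="dirichlet_process",
    covariance_type="full",
    covariance_prior=np.eye(d),                        # inverse-Wishart scale matrix
    degrees_of_freedom_prior=d + 2,                    # must exceed d - 1
    max_iter=500,
    random_state=0,
)
labels = dpmm.fit_predict(X)

# Components that received non-negligible posterior weight:
n_used = int(np.sum(dpmm.weights_ > 0.01))
```

Changing `covariance_prior` (e.g. scaling the identity matrix up or down) shifts how diffuse the implied covariances are, which is a simplified analogue of the prior sensitivity the paper investigates; note, however, that scikit-learn uses variational inference rather than the MCMC samplers assessed in the article.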
Pages: 25
Related Papers
50 entries in total
  • [21] A comparative review of variable selection techniques for covariate dependent Dirichlet process mixture models
    Barcella, William
    De Iorio, Maria
    Baio, Gianluca
    CANADIAN JOURNAL OF STATISTICS-REVUE CANADIENNE DE STATISTIQUE, 2017, 45 (03): 254 - 273
  • [22] Power-Expected-Posterior Priors for Variable Selection in Gaussian Linear Models
    Fouskakis, Dimitris
    Ntzoufras, Ioannis
    Draper, David
    BAYESIAN ANALYSIS, 2015, 10 (01): 75 - 107
  • [23] Are Gibbs-Type Priors the Most Natural Generalization of the Dirichlet Process?
    De Blasi, Pierpaolo
    Favaro, Stefano
    Lijoi, Antonio
    Mena, Ramses H.
    Prunster, Igor
    Ruggiero, Matteo
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2015, 37 (02) : 212 - 229
  • [24] Quantum annealing for Dirichlet process mixture models with applications to network clustering
    Sato, Issei
    Tanaka, Shu
    Kurihara, Kenichi
    Miyashita, Seiji
    Nakagawa, Hiroshi
    NEUROCOMPUTING, 2013, 121 : 523 - 531
  • [26] A DIRICHLET PROCESS MIXTURE OF HIDDEN MARKOV MODELS FOR PROTEIN STRUCTURE PREDICTION
    Lennox, Kristin P.
    Dahl, David B.
    Vannucci, Marina
    Day, Ryan
    Tsai, Jerry W.
    ANNALS OF APPLIED STATISTICS, 2010, 4 (02) : 916 - 942
  • [27] Partially collapsed parallel Gibbs sampler for Dirichlet process mixture models
    Yerebakan, Halid Ziya
    Dundar, Murat
    PATTERN RECOGNITION LETTERS, 2017, 90 : 22 - 27
  • [28] CONTEXT-AWARE PREFERENCE LEARNING SYSTEM BASED ON DIRICHLET PROCESS GAUSSIAN MIXTURE MODEL
    Xu, Xianbo
    van Erp, Bart
    Ignatenko, Tanya
    2024 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING, ICASSP 2024, 2024: 6805 - 6809
  • [29] Dirichlet process mixture models for unsupervised clustering of symptoms in Parkinson's disease
    White, Nicole
    Johnson, Helen
    Silburn, Peter
    Mengersen, Kerrie
    JOURNAL OF APPLIED STATISTICS, 2012, 39 (11) : 2363 - 2377
  • [30] Hierarchical mixture modeling with normalized inverse-Gaussian priors
    Lijoi, A
    Mena, RH
    Prünster, I
    JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, 2005, 100 (472) : 1278 - 1291