Selecting the precision parameter prior in Dirichlet process mixture models

Cited by: 13
Authors
Murugiah, Siva [1]
Sweeting, Trevor [1]
Affiliation
[1] UCL, Dept Stat Sci, London WC1E 6BT, England
Keywords
Bayesian nonparametrics; Dirichlet process; Empirical Bayes; Mixture models
DOI
10.1016/j.jspi.2012.02.013
CLC classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline classification codes
020208; 070103; 0714
Abstract
We consider Dirichlet process mixture models in which the observed clusters in any particular dataset are not viewed as belonging to a finite set of possible clusters, but rather as representatives of a latent structure in which objects belong to one of a potentially infinite number of clusters. As more information is revealed, the number of inferred clusters is allowed to grow. The precision parameter of the Dirichlet process is crucial because it controls the number of clusters. We develop a framework for specifying the hyperparameters of the prior for the precision parameter that can be used in either the presence or absence of subjective prior information about the level of clustering. Our approach is illustrated by an analysis of brand clustering at the magazine Which?. The results are compared with the approach of Dorazio (2009) via a simulation study. (C) 2012 Elsevier B.V. All rights reserved.
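The role of the precision parameter alpha is most easily seen through the Chinese restaurant process representation of the Dirichlet process, under which the prior expected number of clusters among n observations is E[K_n] = sum_{i=1}^{n} alpha / (alpha + i - 1), so larger alpha yields more clusters. The sketch below illustrates this standard representation only; it is not the hyperparameter-selection procedure of the paper, and the function name crp_cluster_count and the chosen values of n and alpha are purely illustrative.

import numpy as np

def crp_cluster_count(n, alpha, rng):
    """Simulate Chinese restaurant process assignments with precision
    parameter alpha and return the resulting number of clusters."""
    counts = []  # sizes of the clusters created so far
    for _ in range(n):
        # join an existing cluster with probability proportional to its size,
        # or open a new cluster with probability proportional to alpha
        probs = np.array(counts + [alpha], dtype=float)
        probs /= probs.sum()
        k = rng.choice(len(probs), p=probs)
        if k == len(counts):
            counts.append(1)      # new cluster
        else:
            counts[k] += 1        # existing cluster
    return len(counts)

rng = np.random.default_rng(0)
n = 200  # illustrative sample size
for alpha in (0.5, 2.0, 10.0):  # illustrative precision values
    sims = [crp_cluster_count(n, alpha, rng) for _ in range(200)]
    # analytic prior mean: E[K_n] = sum_{i=1}^{n} alpha / (alpha + i - 1)
    expected = sum(alpha / (alpha + i) for i in range(n))
    print(f"alpha={alpha:5.1f}: simulated mean clusters {np.mean(sims):5.1f} "
          f"(theory {expected:5.1f})")

Running the sketch shows the simulated cluster counts tracking the analytic expectation, which grows roughly like alpha * log(1 + n/alpha); this is the sensitivity that makes the choice of prior for alpha important.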
Pages: 1947-1959
Page count: 13
Related papers
50 items in total
  • [41] Online Variational Learning for a Dirichlet Process Mixture of Dirichlet Distributions and Its Application
    Fan, Wentao
    Bouguila, Nizar
    2012 11TH INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND APPLICATIONS (ICMLA 2012), VOL 1, 2012, : 362 - 367
  • [42] Bayesian nonparametric analysis for a generalized Dirichlet process prior
    Lijoi, A.
    Mena, R. H.
    Prünster, I.
    Statistical Inference for Stochastic Processes, 2005, 8 (3) : 283 - 309
  • [43] Dirichlet compound negative multinomial mixture models and applications
    Bregu, Ornela
    Bouguila, Nizar
    ADVANCES IN DATA ANALYSIS AND CLASSIFICATION, 2024,
  • [44] Modeling unobserved sources of heterogeneity in animal abundance using a Dirichlet process prior
    Dorazio, Robert M.
    Mukherjee, Bhramar
    Zhang, Li
    Ghosh, Malay
    Jelks, Howard L.
    Jordan, Frank
    BIOMETRICS, 2008, 64 (02) : 635 - 644
  • [45] A Bayesian model for supervised clustering with the dirichlet process prior
    Daume, H.
    Marcu, D.
    JOURNAL OF MACHINE LEARNING RESEARCH, 2005, 6 : 1551 - 1577
  • [46] Truncated Poisson–Dirichlet approximation for Dirichlet process hierarchical models
    Zhang, Junyi
    Dassios, Angelos
    Statistics and Computing, 2023, 33
  • [47] Variational Learning for Finite Dirichlet Mixture Models and Applications
    Fan, Wentao
    Bouguila, Nizar
    Ziou, Djemel
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2012, 23 (05) : 762 - 774
  • [48] Distributionally Robust Edge Learning with Dirichlet Process Prior
    Zhang, Zhaofeng
    Chen, Yue
    Zhang, Junshan
    2020 IEEE 40TH INTERNATIONAL CONFERENCE ON DISTRIBUTED COMPUTING SYSTEMS (ICDCS), 2020, : 798 - 808
  • [49] Application of Dirichlet mixture of normals in growth curve models
    Huang, Steward
    SenGupta, Ashis
    Lii, Keh-Shin
    STATISTICAL METHODOLOGY, 2011, 8 (05) : 434 - 441
  • [50] Mean field inference for the Dirichlet process mixture model
    Zobay, O.
    ELECTRONIC JOURNAL OF STATISTICS, 2009, 3 : 507 - 545