On a Dirichlet Process Mixture Representation of Phase-Type Distributions

Cited by: 1
Authors
Ayala, Daniel [1 ]
Jofre, Leonardo [1 ]
Gutierrez, Luis [2 ]
Mena, Ramses H. [3 ]
Affiliations
[1] Pontificia Univ Catolica Chile, Dept Estadist, Santiago, Region Metropol, Chile
[2] Pontificia Univ Catolica Chile, Dept Estadist, ANID Millennium Sci Initiat Program, Millennium Nucleus Ctr Discovery Struct Complex D, Santiago, Region Metropol, Chile
[3] IIMAS UNAM, Mexico City, DF, Mexico
Source
BAYESIAN ANALYSIS | 2022, Vol. 17, No. 3
Keywords
Bayesian nonparametrics; Erlang distribution; mixture model; renewal function; SAMPLING METHODS; MODELS;
DOI
10.1214/21-BA1272
Chinese Library Classification
O1 [Mathematics];
Discipline Code
0701; 070101;
Abstract
An explicit representation of phase-type distributions as an infinite mixture of Erlang distributions is introduced. The representation unveils a novel and useful connection between a class of Bayesian nonparametric mixture models and phase-type distributions. In particular, this sheds some light on two active topics: estimation techniques for phase-type distributions, and the availability of closed-form expressions for some functionals related to Dirichlet process mixture models. The power of this connection is illustrated via a posterior inference algorithm to estimate phase-type distributions, avoiding some difficulties with the simulation of latent Markov jump processes, commonly encountered in phase-type Bayesian inference. On the other hand, closed-form expressions for functionals of Dirichlet process mixture models are illustrated with density and renewal function estimation, related to the optimal salmon weight distribution of an aquaculture study.
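The abstract's central object, a Dirichlet process mixture of Erlang distributions, can be sketched via the standard truncated stick-breaking construction. This is a generic illustration of that construction, not the paper's own representation or inference algorithm; the function names, the choice of a fixed common rate, and the uniform base measure on the Erlang shapes are all assumptions made for the sketch.

```python
import math
import random

def erlang_pdf(x, k, rate):
    """Density of an Erlang(k, rate) distribution (a Gamma with integer shape k)."""
    if x < 0:
        return 0.0
    return (rate ** k) * (x ** (k - 1)) * math.exp(-rate * x) / math.factorial(k - 1)

def stick_breaking_weights(alpha, trunc, rng):
    """Truncated stick-breaking weights of a Dirichlet process with concentration alpha."""
    weights, remaining = [], 1.0
    for _ in range(trunc - 1):
        v = rng.betavariate(1.0, alpha)  # Beta(1, alpha) stick proportions
        weights.append(remaining * v)
        remaining *= 1.0 - v
    weights.append(remaining)  # assign the leftover stick mass to the last atom
    return weights

def dp_erlang_mixture_density(alpha, rate, max_shape, trunc, seed=0):
    """Draw one random density f(x) = sum_j w_j * Erlang(x; k_j, rate),
    with shapes k_j drawn from a (hypothetical) uniform base measure on {1, ..., max_shape}."""
    rng = random.Random(seed)
    weights = stick_breaking_weights(alpha, trunc, rng)
    shapes = [rng.randint(1, max_shape) for _ in weights]
    return lambda x: sum(w * erlang_pdf(x, k, rate) for w, k in zip(weights, shapes))
```

Since each Erlang component integrates to one and the truncated weights sum to one, each realized mixture is itself a proper density on the positive half-line, which can be checked by numerical integration.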
Pages: 765 - 790
Page count: 26
Related Papers
50 records in total
  • [21] Nonparametric empirical Bayes for the Dirichlet process mixture model
    McAuliffe, JD
    Blei, DM
    Jordan, MI
    STATISTICS AND COMPUTING, 2006, 16 (01) : 5 - 14
  • [23] Variational learning of a Dirichlet process of generalized Dirichlet distributions for simultaneous clustering and feature selection
    Fan, Wentao
    Bouguila, Nizar
    PATTERN RECOGNITION, 2013, 46 (10) : 2754 - 2769
  • [24] Selecting the precision parameter prior in Dirichlet process mixture models
    Murugiah, Siva
    Sweeting, Trevor
    JOURNAL OF STATISTICAL PLANNING AND INFERENCE, 2012, 142 (07) : 1947 - 1959
  • [25] A Dirichlet Process Mixture Model for Non-Ignorable Dropout
    Moore, Camille M.
    Carlson, Nichole E.
    MaWhinney, Samantha
    Kreidler, Sarah
    BAYESIAN ANALYSIS, 2020, 15 (04): : 1139 - 1167
  • [26] On selecting a prior for the precision parameter of Dirichlet process mixture models
    Dorazio, Robert M.
    JOURNAL OF STATISTICAL PLANNING AND INFERENCE, 2009, 139 (09) : 3384 - 3390
  • [27] Dirichlet Process Mixture of Mixtures Model for Unsupervised Subword Modeling
    Heck, Michael
    Sakti, Sakriani
    Nakamura, Satoshi
    IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2018, 26 (11) : 2027 - 2042
  • [28] A Sequential Algorithm for Fast Fitting of Dirichlet Process Mixture Models
    Zhang, Xiaole
    Nott, David J.
    Yau, Christopher
    Jasra, Ajay
    JOURNAL OF COMPUTATIONAL AND GRAPHICAL STATISTICS, 2014, 23 (04) : 1143 - 1162
  • [29] Dirichlet Process Gaussian Mixture Models: Choice of the Base Distribution
    Görür, Dilan
    Rasmussen, Carl Edward
    JOURNAL OF COMPUTER SCIENCE AND TECHNOLOGY, 2010, 25 (04) : 653 - 664
  • [30] Dirichlet process mixture models for the analysis of repeated attempt designs
    Daniels, Michael J.
    Lee, Minji
    Feng, Wei
    BIOMETRICS, 2023, 79 (04) : 3907 - 3915