On a Dirichlet Process Mixture Representation of Phase-Type Distributions

Cited by: 1
Authors
Ayala, Daniel [1 ]
Jofre, Leonardo [1 ]
Gutierrez, Luis [2 ]
Mena, Ramses H. [3 ]
Affiliations
[1] Pontificia Univ Catolica Chile, Dept Estadist, Santiago, Region Metropol, Chile
[2] Pontificia Univ Catolica Chile, Dept Estadist, ANID Millennium Sci Initiat Program, Millennium Nucleus Ctr Discovery Struct Complex D, Santiago, Region Metropol, Chile
[3] IIMAS UNAM, Mexico City, DF, Mexico
Source
BAYESIAN ANALYSIS | 2022, Vol. 17, No. 3
Keywords
Bayesian nonparametrics; Erlang distribution; mixture model; renewal function; SAMPLING METHODS; MODELS;
DOI
10.1214/21-BA1272
Chinese Library Classification (CLC)
O1 [Mathematics];
Subject classification codes
0701; 070101;
Abstract
An explicit representation of phase-type distributions as an infinite mixture of Erlang distributions is introduced. The representation unveils a novel and useful connection between a class of Bayesian nonparametric mixture models and phase-type distributions. In particular, it sheds light on two active topics: estimation techniques for phase-type distributions, and the availability of closed-form expressions for some functionals related to Dirichlet process mixture models. The power of this connection is illustrated via a posterior inference algorithm for estimating phase-type distributions that avoids some of the difficulties with simulating latent Markov jump processes commonly encountered in Bayesian inference for phase-type distributions. In addition, closed-form expressions for functionals of Dirichlet process mixture models are illustrated with density and renewal function estimation, related to the optimal salmon weight distribution of an aquaculture study.
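The Erlang-mixture representation described in the abstract can be sketched numerically with a truncated stick-breaking construction of a Dirichlet process mixture of Erlang densities. This is a minimal illustration only: the concentration parameter `alpha`, the common rate, the shape range, and the truncation level `K` are assumptions for the sketch, not values from the paper.

```python
import math
import random

random.seed(0)

# Illustrative hyperparameters (assumptions, not taken from the paper):
# DP concentration alpha, truncation level K, and common Erlang rate.
alpha, K, rate = 1.0, 50, 2.0

# Truncated stick-breaking: weights w_j = b_j * prod_{i<j} (1 - b_i),
# with b_j ~ Beta(1, alpha); integer Erlang shapes drawn from a simple
# base measure (uniform on {1, ..., 10} here, again an assumption).
remaining = 1.0
weights, shapes = [], []
for _ in range(K):
    b = random.betavariate(1.0, alpha)
    weights.append(b * remaining)
    remaining *= 1.0 - b
    shapes.append(random.randint(1, 10))

def erlang_pdf(x, k, lam):
    """Erlang(k, lam) density: lam^k x^(k-1) exp(-lam x) / (k-1)!."""
    return lam ** k * x ** (k - 1) * math.exp(-lam * x) / math.factorial(k - 1)

def mixture_pdf(x):
    """Density of the truncated Erlang mixture at x > 0."""
    return sum(w * erlang_pdf(x, k, rate) for w, k in zip(weights, shapes))

print(round(mixture_pdf(1.0), 4))
```

Because the mixture components are Erlang densities with a shared rate, the truncated mixture is itself (approximately) a phase-type density, which is the direction of the correspondence the paper exploits.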
Pages: 765-790
Page count: 26