The Pitman-Yor multinomial process for mixture modelling

Cited by: 17
Authors: Lijoi, Antonio [1]; Prunster, Igor [1]; Rigon, Tommaso [1]
Affiliations: [1] Bocconi Univ, Dept Decis Sci, Via Rontgen 1, I-20136 Milan, Italy
Keywords: Bayesian nonparametric inference; Convex mixture regression; Exchangeable random partition; Pitman-Yor process; Ratio-stable distribution; Species sampling model; DIRICHLET; NUMBER
DOI: 10.1093/biomet/asaa030
Chinese Library Classification: Q [Biological Sciences]
Discipline codes: 07; 0710; 09
Abstract
Discrete nonparametric priors play a central role in a variety of Bayesian procedures, most notably when used to model latent features, such as in clustering, mixtures and curve fitting. They are effective and well-developed tools, though their infinite dimensionality is unsuited to some applications. If one restricts to a finite-dimensional simplex, very little is known beyond the traditional Dirichlet multinomial process, which is mainly motivated by conjugacy. This paper introduces an alternative based on the Pitman-Yor process, which provides greater flexibility while preserving analytical tractability. Urn schemes and posterior characterizations are obtained in closed form, leading to exact sampling methods. In addition, the proposed approach can be used to accurately approximate the infinite-dimensional Pitman-Yor process, yielding improvements over existing truncation-based approaches. An application to convex mixture regression for quantitative risk assessment illustrates the theoretical results and compares our approach with existing methods.
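For context only: the paper derives a finite-dimensional urn scheme for the Pitman-Yor multinomial process, which is not reproduced here. The sketch below implements the standard two-parameter Pitman-Yor urn (Chinese restaurant) scheme that the proposed process is designed to approximate, under the usual discount/strength parametrization (sigma, theta); the function name pitman_yor_urn and the NumPy-based code are illustrative assumptions, not taken from the paper.

    import numpy as np

    def pitman_yor_urn(n, sigma, theta, seed=None):
        """Sample a random partition of n items from the standard
        two-parameter Pitman-Yor urn scheme (illustrative sketch).

        sigma: discount parameter, 0 <= sigma < 1
        theta: strength parameter, theta > -sigma (theta > 0 assumed here)
        Returns the list of cluster sizes.
        """
        rng = np.random.default_rng(seed)
        counts = []  # counts[j] = number of items already in cluster j
        for i in range(n):
            k = len(counts)
            # Existing cluster j is chosen with prob. (counts[j] - sigma) / (theta + i);
            # a new cluster is opened with prob. (theta + k * sigma) / (theta + i).
            probs = np.append(np.array(counts, dtype=float) - sigma, theta + k * sigma)
            probs /= theta + i
            j = rng.choice(k + 1, p=probs)
            if j == k:
                counts.append(1)
            else:
                counts[j] += 1
        return counts

    # Example: a larger discount sigma tends to produce more clusters.
    print(pitman_yor_urn(100, sigma=0.25, theta=1.0, seed=0))

With sigma = 0 this urn reduces to the Blackwell-MacQueen (Dirichlet process) scheme, mirroring the relationship between the Dirichlet multinomial process and the Pitman-Yor-based alternative studied in the paper.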
Pages: 891-906 (16 pages)
Related papers (50 in total; items [21]-[30] shown):
  • [21] Spatial emission tomography reconstruction using Pitman-Yor process
    Fall, Mame Diarra
    Barat, Eric
    Mohammad-Djafari, Ali
    Comtat, Claude
    BAYESIAN INFERENCE AND MAXIMUM ENTROPY METHODS IN SCIENCE AND ENGINEERING, 2009, 1193 : 194+
  • [22] Pitman-Yor process mixture model for community structure exploration considering latent interaction patterns
    Wang, Jing
    Li, Kan
    CHINESE PHYSICS B, 2021, 30 (12)
  • [23] Online Learning of Hierarchical Pitman-Yor Process Mixture of Generalized Dirichlet Distributions With Feature Selection
    Fan, Wentao
    Sallay, Hassen
    Bouguila, Nizar
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2017, 28 (09) : 2048 - 2061
  • [24] BAYESIAN COMMON SPATIAL PATTERNS WITH PITMAN-YOR PROCESS PRIORS
    Kang, Hyohyeong
    Choi, Seungjin
    2012 PROCEEDINGS OF THE 20TH EUROPEAN SIGNAL PROCESSING CONFERENCE (EUSIPCO), 2012, : 684 - 688
  • [25] A Parallel Training Algorithm for Hierarchical Pitman-Yor Process Language Models
    Huang, Songfang
    Renals, Steve
    INTERSPEECH 2009: 10TH ANNUAL CONFERENCE OF THE INTERNATIONAL SPEECH COMMUNICATION ASSOCIATION 2009, VOLS 1-5, 2009, : 2663 - 2666
  • [26] Bernstein-von Mises theorem for the Pitman-Yor process of nonnegative discrete random measures
    Franssen, S. E. M. P.
    van der Vaart, A. W.
    ELECTRONIC JOURNAL OF STATISTICS, 2022, 16 (02): : 5779 - 5811
  • [27] Limits of renewal processes and Pitman-Yor distribution
    Basrak, Bojan
    ELECTRONIC COMMUNICATIONS IN PROBABILITY, 2015, 20
  • [28] Importance conditional sampling for Pitman-Yor mixtures
    Canale, Antonio
    Corradin, Riccardo
    Nipoti, Bernardo
    STATISTICS AND COMPUTING, 2022, 32 (03)
  • [29] Hierarchical Pitman-Yor language models for ASR in meetings
    Huang, Songfang
    Renals, Steve
    2007 IEEE WORKSHOP ON AUTOMATIC SPEECH RECOGNITION AND UNDERSTANDING, VOLS 1 AND 2, 2007, : 124 - 129
  • [30] Hierarchical Pitman-Yor Language Model for Information Retrieval
    Momtazi, Saeedeh
    Klakow, Dietrich
    SIGIR 2010: PROCEEDINGS OF THE 33RD ANNUAL INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH DEVELOPMENT IN INFORMATION RETRIEVAL, 2010, : 793 - 794