Variational Bayesian inference for a Dirichlet process mixture of beta distributions and application

Cited: 7
Authors
Lai, Yuping [1 ]
Ping, Yuan [2 ]
Xiao, Ke [1 ]
Hao, Bin [3 ]
Zhang, Xiufeng [4 ]
Affiliations
[1] North China Univ Technol, Coll Comp Sci & Technol, Beijing, Peoples R China
[2] Xuchang Univ, Sch Informat Engn, Xuchang, Peoples R China
[3] Chinese Univ Hong Kong, Inst Network Coding, Shatin, Hong Kong, Peoples R China
[4] Natl Res Ctr Rehabil Tech Aids, Beijing, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Dirichlet process; Nonparametric Bayesian analysis; Beta distribution; Infinite mixture model; Variational inference; Image categorization; Object detection; HIDDEN MARKOV MODEL; INFORMATION CRITERION; CLASSIFICATION;
DOI
10.1016/j.neucom.2017.07.068
Chinese Library Classification
TP18 [Artificial intelligence theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
The finite beta mixture model (BMM) has been shown to be flexible and powerful for modeling bounded-support data. However, the BMM cannot automatically select the proper number of mixture components from the observed data, a choice that strongly affects modeling accuracy. In this paper, we tackle this problem with an infinite beta mixture model (InBMM). It is based on a Dirichlet process (DP) mixture, which assumes a countably infinite number of mixture components a priori so that the effective number can be determined automatically from the observed data. Further, a variational InBMM using a single lower-bound approximation (VBInBMM) is proposed; it applies the stick-breaking representation of the DP and is learned within an extended variational inference framework. Numerical experiments on both synthetic and real data, drawn from two challenging applications, namely image categorization and object detection, demonstrate the good performance of the proposed method. (C) 2017 Published by Elsevier B.V.
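As a rough illustration of the generative side of the model described in the abstract, the following Python/NumPy sketch draws bounded data from a truncated stick-breaking DP mixture of beta distributions. The truncation level, the concentration parameter, and the Gamma base measure over the beta shape parameters are illustrative assumptions; the paper's variational learning procedure (VBInBMM) is not reproduced here.

```python
# A minimal sketch (not the paper's implementation): generating bounded-support data
# from a truncated stick-breaking Dirichlet process mixture of beta distributions.
# Truncation level T, concentration alpha, and the Gamma base measure over the
# beta shape parameters are illustrative assumptions, not values from the paper.
import numpy as np

rng = np.random.default_rng(0)

def sample_dp_beta_mixture(n, alpha=2.0, T=20):
    # Stick-breaking construction: v_t ~ Beta(1, alpha),
    # pi_t = v_t * prod_{s<t} (1 - v_s); the last stick is closed at truncation T.
    v = rng.beta(1.0, alpha, size=T)
    v[-1] = 1.0
    remaining = np.cumprod(np.concatenate(([1.0], 1.0 - v[:-1])))
    pi = v * remaining
    pi /= pi.sum()                       # guard against floating-point drift

    # Base measure G0 over the beta shape parameters (u_t, w_t); Gamma priors are
    # a common choice for beta mixtures (an assumption here).
    u = rng.gamma(shape=2.0, scale=2.0, size=T)
    w = rng.gamma(shape=2.0, scale=2.0, size=T)

    # Component assignments, then observations bounded in (0, 1).
    z = rng.choice(T, size=n, p=pi)
    x = rng.beta(u[z], w[z])
    return x, z, pi, (u, w)

x, z, pi, (u, w) = sample_dp_beta_mixture(1000)
print("components actually used:", np.unique(z).size)
```

With a moderate concentration parameter, only a handful of the T candidate components receive appreciable weight, which is the behavior that lets the DP mixture adapt the number of components to the data.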
Pages: 23-33
Number of pages: 11