Learning Topic Models by Belief Propagation

Cited by: 34
Authors
Zeng, Jia [1 ]
Cheung, William K. [2 ]
Liu, Jiming [2 ]
Affiliations
[1] Soochow Univ, Sch Comp Sci & Technol, Suzhou 215006, Peoples R China
[2] Hong Kong Baptist Univ, Dept Comp Sci, Kowloon Tong, Hong Kong, Peoples R China
Keywords
Latent Dirichlet allocation; topic models; belief propagation; message passing; factor graph; Bayesian networks; Markov random fields; hierarchical Bayesian models; Gibbs sampling; variational Bayes; EM;
DOI
10.1109/TPAMI.2012.185
CLC number
TP18 [Theory of Artificial Intelligence];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Latent Dirichlet allocation (LDA) is an important hierarchical Bayesian model for probabilistic topic modeling, which attracts worldwide interest and touches on many important applications in text mining, computer vision, and computational biology. This paper represents the collapsed LDA as a factor graph, which enables the classic loopy belief propagation (BP) algorithm for approximate inference and parameter estimation. Although the two commonly used approximate inference methods, variational Bayes (VB) and collapsed Gibbs sampling (GS), have gained great success in learning LDA, the proposed BP is competitive in both speed and accuracy, as validated by encouraging experimental results on four large-scale document datasets. Furthermore, the BP algorithm has the potential to become a generic scheme for learning variants of LDA-based topic models in the collapsed space. To this end, we show how to learn two typical variants of LDA-based topic models, namely author-topic models (ATM) and relational topic models (RTM), using BP based on the factor graph representations.
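The message-passing scheme the abstract describes can be sketched as follows. This is a minimal, hedged illustration of synchronous loopy BP for collapsed LDA, not the authors' reference implementation: each token carries a K-dimensional message over topics, and each update multiplies a document-side "cavity" count (soft topic counts in the document, excluding the token's own message, plus α) by a vocabulary-side term (soft word-topic counts, excluding the token, plus β, normalized by topic totals plus Wβ). The function name `bp_update` and the token-list input format are assumptions made for this sketch.

```python
import numpy as np

def bp_update(docs, W, K, alpha=0.1, beta=0.01, iters=50, seed=0):
    """Synchronous loopy BP for collapsed LDA (illustrative sketch).

    docs : list of (doc_id, word_id) token pairs
    W    : vocabulary size, K : number of topics
    Returns an (N, K) array of per-token topic messages.
    """
    rng = np.random.default_rng(seed)
    N = len(docs)
    d_idx = np.array([d for d, _ in docs])
    w_idx = np.array([w for _, w in docs])
    D = d_idx.max() + 1

    # random normalized initial messages
    mu = rng.random((N, K))
    mu /= mu.sum(axis=1, keepdims=True)

    for _ in range(iters):
        # soft counts aggregated from the current messages
        nd = np.zeros((D, K)); np.add.at(nd, d_idx, mu)  # doc-topic counts
        nw = np.zeros((W, K)); np.add.at(nw, w_idx, mu)  # word-topic counts
        nk = nw.sum(axis=0)                              # topic totals

        # "cavity" terms: subtract each token's own message before smoothing
        theta = nd[d_idx] - mu + alpha
        phi = (nw[w_idx] - mu + beta) / (nk - mu + W * beta)

        new = theta * phi
        mu = new / new.sum(axis=1, keepdims=True)
    return mu
```

After convergence, the normalized soft counts give the usual point estimates of the document-topic and topic-word distributions, analogous to how GS estimates them from hard topic assignments.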
Pages: 1121 - 1134
Page count: 14
Related Papers
50 records
  • [21] Multirelational Topic Models
    Zeng, Jia
    Cheung, William K.
    Li, Chun-hung
    Liu, Jiming
    2009 9TH IEEE INTERNATIONAL CONFERENCE ON DATA MINING, 2009, : 1070 - 1075
  • [22] Efficient Belief Propagation for Early Vision
    Felzenszwalb, Pedro F.
    Huttenlocher, Daniel P.
    International Journal of Computer Vision, 2006, 70 : 41 - 54
  • [23] Background Subtraction Using Belief Propagation
    Hahn, Hee-il
    ICINCO 2011: PROCEEDINGS OF THE 8TH INTERNATIONAL CONFERENCE ON INFORMATICS IN CONTROL, AUTOMATION AND ROBOTICS, VOL 2, 2011, : 281 - 286
  • [24] On Convergence Conditions of Gaussian Belief Propagation
    Su, Qinliang
    Wu, Yik-Chung
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2015, 63 (05) : 1144 - 1155
  • [26] SpectralLeader: Online Spectral Learning for Single Topic Models
    Yu, Tong
    Kveton, Branislav
    Wen, Zheng
    Bui, Hung
    Mengshoel, Ole J.
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2018, PT II, 2019, 11052 : 379 - 395
  • [27] Scalable Multitarget Tracking Using Multiple Sensors: A Belief Propagation Approach
    Meyer, Florian
    Braca, Paolo
    Willett, Peter
    Hlawatsch, Franz
    2015 18TH INTERNATIONAL CONFERENCE ON INFORMATION FUSION (FUSION), 2015, : 1778 - 1785
  • [28] Belief Propagation and Learning in Convolution Multi-Layer Factor Graphs
    Palmieri, Francesco A. N.
    Buonanno, Amedeo
    2014 4TH INTERNATIONAL WORKSHOP ON COGNITIVE INFORMATION PROCESSING (CIP), 2014,
  • [29] Learning multiple belief propagation fixed points for real time inference
    Furtlehner, Cyril
    Lasgouttes, Jean-Marc
    Auger, Anne
    PHYSICA A-STATISTICAL MECHANICS AND ITS APPLICATIONS, 2010, 389 (01) : 149 - 163
  • [30] Deep learning via message passing algorithms based on belief propagation
    Lucibello, Carlo
    Pittorino, Fabrizio
    Perugini, Gabriele
    Zecchina, Riccardo
    MACHINE LEARNING-SCIENCE AND TECHNOLOGY, 2022, 3 (03):