Asynchronous distributed estimation of topic models for document analysis

Cited by: 8
Authors
Asuncion, Arthur U. [1]
Smyth, Padhraic [1]
Welling, Max [1]
Affiliation
[1] Univ Calif Irvine, Dept Comp Sci, Irvine, CA 92717 USA
Keywords
Topic model; Distributed learning; Parallelization; Gibbs sampling
DOI
10.1016/j.stamet.2010.03.002
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline codes
020208; 070103; 0714
Abstract
Given the prevalence of large data sets and the availability of inexpensive parallel computing hardware, there is significant motivation to explore distributed implementations of statistical learning algorithms. In this paper, we present a distributed learning framework for Latent Dirichlet Allocation (LDA), a well-known Bayesian latent variable model for sparse matrices of count data. In the proposed approach, data are distributed across P processors, and processors independently perform inference on their local data and communicate their sufficient statistics in a local asynchronous manner with other processors. We apply two different approximate inference techniques for LDA, collapsed Gibbs sampling and collapsed variational inference, within a distributed framework. The results show significant improvements in computation time and memory when running the algorithms on very large text corpora using parallel hardware. Despite the approximate nature of the proposed approach, simulations suggest that asynchronous distributed algorithms are able to learn models that are nearly as accurate as those learned by the standard non-distributed approaches. We also find that our distributed algorithms converge rapidly to good solutions. (C) 2010 Elsevier B.V. All rights reserved.
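The abstract describes inference for LDA via collapsed Gibbs sampling, where each processor resamples topic assignments against local count tables before exchanging sufficient statistics. As an illustration of the single-processor building block only (not the paper's distributed protocol), here is a minimal collapsed Gibbs sampler for LDA; the function name and all parameters are hypothetical, and the standard conditional p(z=k | rest) ∝ (n_dk + α)(n_kw + β)/(n_k + Vβ) is assumed.

```python
import random

def collapsed_gibbs_lda(docs, V, K, alpha=0.1, beta=0.01, iters=50, seed=0):
    """Collapsed Gibbs sampler for LDA (hypothetical illustration).

    docs : list of documents, each a list of word ids in [0, V)
    Returns the topic assignments z and the topic-word count table nkw.
    """
    rng = random.Random(seed)
    ndk = [[0] * K for _ in docs]      # document-topic counts
    nkw = [[0] * V for _ in range(K)]  # topic-word counts
    nk = [0] * K                       # total tokens per topic
    z = []
    # Random initialization of topic assignments, updating all counts.
    for d, doc in enumerate(docs):
        zd = []
        for w in doc:
            k = rng.randrange(K)
            zd.append(k)
            ndk[d][k] += 1; nkw[k][w] += 1; nk[k] += 1
        z.append(zd)
    for _ in range(iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                # Remove the current assignment from the counts...
                k = z[d][i]
                ndk[d][k] -= 1; nkw[k][w] -= 1; nk[k] -= 1
                # ...sample a new topic from the collapsed conditional...
                weights = [(ndk[d][t] + alpha) * (nkw[t][w] + beta)
                           / (nk[t] + V * beta) for t in range(K)]
                r = rng.random() * sum(weights)
                acc = 0.0
                for t, wt in enumerate(weights):
                    acc += wt
                    if r < acc:
                        k = t
                        break
                # ...and restore the counts with the new assignment.
                z[d][i] = k
                ndk[d][k] += 1; nkw[k][w] += 1; nk[k] += 1
    return z, nkw
```

In the paper's asynchronous setting, each of the P processors would run this inner loop on its local documents while the word-topic table nkw is reconciled with other processors via pairwise exchanges of sufficient statistics, rather than being held globally as above.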
Pages: 3-17 (15 pages)