Evaluation of the Dirichlet Process Multinomial Mixture Model for Short-Text Topic Modeling

Cited by: 1
Authors
Karlsson, Alexander [1 ]
Duarte, Denio [2 ]
Mathiason, Gunnar [1 ]
Bae, Juhee [1 ]
Affiliations
[1] Univ Skovde, Sch Informat, Skovde, Sweden
[2] Fed Univ Fronteira Sul, Campus Chapeco, Chapeco, Brazil
Source
2018 6TH INTERNATIONAL SYMPOSIUM ON COMPUTATIONAL AND BUSINESS INTELLIGENCE (ISCBI 2018) | 2018
Keywords
text analysis; topic modeling; Bayesian non-parametrics; Dirichlet process; short text;
DOI
10.1109/ISCBI.2018.00025
CLC Number
TP301 [Theory, Methods]
Discipline Code
081202
Abstract
Fast-moving trends, both in society and in highly competitive business areas, call for effective methods of automatic analysis. The availability of fast-moving sources of short texts, such as social media and blogs, allows aggregation from a vast number of text sources for an up-to-date view of trends and business insights. Topic modeling is an established approach for analyzing large amounts of text, but the scarcity of statistical information in short texts is considered a major obstacle to obtaining reliable topics from traditional models such as LDA. A range of specialized topic models has been proposed, but most of these approaches rely on rather strong parametric assumptions, such as a fixed number of topics. In contrast, recent advances in Bayesian non-parametrics suggest the Dirichlet process as a method that, given certain hyper-parameters, can adapt the number of topics to the data at hand. We perform an empirical evaluation of the Dirichlet process multinomial (unigram) mixture model against several parametric topic models initialized with different numbers of topics. The resulting models are evaluated using both direct and indirect measures that have been found to correlate well with human topic rankings. We show that the Dirichlet process multinomial mixture model is a viable option for short-text topic modeling: on average it performs better than, or nearly as well as, the parametric alternatives, while reducing parameter-setting requirements and thereby eliminating the need for expensive preprocessing.
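The model family the abstract describes assigns each short document to exactly one cluster with its own multinomial word distribution, and a Dirichlet process (Chinese restaurant process) prior lets the number of clusters adapt rather than being fixed. The sketch below illustrates that idea with a minimal collapsed Gibbs sampler; it is not the authors' implementation, and the hyper-parameter values (`alpha`, `beta`), function names, and toy corpus are illustrative assumptions only.

```python
import random
from collections import defaultdict

def dpmm_cluster(docs, alpha=0.1, beta=0.02, iters=50, seed=1):
    """Illustrative collapsed Gibbs sampler for a Dirichlet process
    multinomial (unigram) mixture: each document belongs to one
    cluster, and the CRP prior lets the number of clusters grow or
    shrink with the data instead of being set in advance."""
    rng = random.Random(seed)
    V = len({w for d in docs for w in d})           # vocabulary size
    z = [-1] * len(docs)                            # cluster label per document
    n_docs = defaultdict(int)                       # documents per cluster
    n_toks = defaultdict(int)                       # tokens per cluster
    n_word = defaultdict(lambda: defaultdict(int))  # word counts per cluster

    def weight(d, k):
        # Unnormalized posterior weight of placing doc d in cluster k
        # (k is None for a brand-new cluster): CRP prior times the
        # Dirichlet-multinomial predictive likelihood of the doc's words.
        p = alpha if k is None else n_docs[k]
        tot = 0 if k is None else n_toks[k]
        seen = defaultdict(int)
        for i, w in enumerate(docs[d]):
            cw = 0 if k is None else n_word[k][w]
            p *= (cw + seen[w] + beta) / (tot + i + V * beta)
            seen[w] += 1
        return p

    next_k = 0
    for _ in range(iters):
        for d in range(len(docs)):
            k = z[d]
            if k != -1:                  # remove doc d from its cluster
                n_docs[k] -= 1
                n_toks[k] -= len(docs[d])
                for w in docs[d]:
                    n_word[k][w] -= 1
                if n_docs[k] == 0:       # empty clusters disappear
                    del n_docs[k], n_toks[k], n_word[k]
            ks = list(n_docs) + [None]   # existing clusters + a new one
            ws = [weight(d, kk) for kk in ks]
            r = rng.random() * sum(ws)
            for kk, wgt in zip(ks, ws):
                r -= wgt
                if r <= 0:
                    break
            if kk is None:               # open a new cluster
                kk = next_k
                next_k += 1
            z[d] = kk
            n_docs[kk] += 1
            n_toks[kk] += len(docs[d])
            for w in docs[d]:
                n_word[kk][w] += 1
    return z

# Toy corpus with two disjoint vocabularies (hypothetical example).
docs = [["apple", "fruit", "apple"], ["fruit", "apple", "juice"],
        ["apple", "juice"],
        ["car", "engine", "wheel"], ["engine", "car"],
        ["wheel", "car", "engine"]]
labels = dpmm_cluster(docs)
```

Note that no number of topics is passed in: the sampler starts with no clusters and, across sweeps, documents with overlapping vocabulary tend to end up sharing a label while the new-cluster option absorbs outliers, which is the self-adapting behavior the paper evaluates against fixed-K alternatives.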
Pages: 79-83 (5 pages)
Cited References
15 in total
  • [1] [Anonymous], 2010, DIRICHLET PROCESS
  • [2] Mixtures of Dirichlet Processes with Applications to Bayesian Nonparametric Problems
    Antoniak, CE
    [J]. ANNALS OF STATISTICS, 1974, 2 (06) : 1152 - 1174
  • [3] Latent Dirichlet allocation
    Blei, DM
    Ng, AY
    Jordan, MI
    [J]. JOURNAL OF MACHINE LEARNING RESEARCH, 2003, 3 (4-5) : 993 - 1022
  • [4] BTM: Topic Modeling over Short Texts
    Cheng, Xueqi
    Yan, Xiaohui
    Lan, Yanyan
    Guo, Jiafeng
    [J]. IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2014, 26 (12) : 2928 - 2941
  • [5] Bayesian Analysis of Some Nonparametric Problems
    Ferguson, TS
    [J]. ANNALS OF STATISTICS, 1973, 1 (02) : 209 - 230
  • [6] Imai Y., 2016, MIXTURE TOPIC MODELS
  • [7] Markov chain sampling methods for Dirichlet process mixture models
    Neal, RM
    [J]. JOURNAL OF COMPUTATIONAL AND GRAPHICAL STATISTICS, 2000, 9 (02) : 249 - 265
  • [8] Newman D., 2010, P 2010 JOINT INT C D, DOI 10.1145/1816123.1816156
  • [9] Nguyen D.Q., 2015, jLDADMM: A Java package for the LDA and DMM topic models
  • [10] Pan YL, 2014, ADV INTEL SYS RES, V109, P301