Full-Text or Abstract? Examining Topic Coherence Scores Using Latent Dirichlet Allocation

Cited by: 246
Authors:
Syed, Shaheen [1]
Spruit, Marco [1]
Affiliations:
[1] Univ Utrecht, Dept Informat & Comp Sci, Utrecht, Netherlands
Source:
2017 IEEE INTERNATIONAL CONFERENCE ON DATA SCIENCE AND ADVANCED ANALYTICS (DSAA) | 2017
Funding:
EU Horizon 2020
Keywords:
SCIENCE; TRENDS; MODEL;
DOI:
10.1109/DSAA.2017.61
Chinese Library Classification:
TP [Automation technology, computer technology]
Discipline code:
0812
Abstract:
This paper assesses topic coherence and human topic ranking of uncovered latent topics from scientific publications when utilizing the topic model latent Dirichlet allocation (LDA) on abstract and full-text data. The coherence of a topic, used as a proxy for topic quality, is based on the distributional hypothesis that states that words with similar meaning tend to co-occur within a similar context. Although LDA has gained much attention from machine-learning researchers, most notably with its adaptations and extensions, little is known about the effects of different types of textual data on generated topics. Our research is the first to explore these practical effects and shows that document frequency, document word length, and vocabulary size have mixed practical effects on topic coherence and human topic ranking of LDA topics. We furthermore show that large document collections are less affected by incorrect or noise terms being part of the topic-word distributions, causing topics to be more coherent and ranked higher. Differences between abstract and full-text data are more apparent within small document collections, with differences as large as 90% high-quality topics for full-text data, compared to 50% high-quality topics for abstract data.
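The co-occurrence-based coherence proxy described in the abstract can be made concrete with a small sketch. The function below implements the UMass coherence score, one widely used document co-occurrence measure; this specific function and its toy corpus are illustrative only and are not taken from the paper itself:

```python
from itertools import combinations
from math import log

def umass_coherence(top_words, documents):
    """UMass topic coherence: sums log((D(w_m, w_l) + 1) / D(w_l)) over
    ranked word pairs, where D counts the documents containing the given
    word(s). A higher (less negative) score means the topic's top words
    co-occur more often, i.e. the topic is more coherent."""
    doc_sets = [set(doc) for doc in documents]

    def doc_freq(*words):
        # number of documents that contain every one of the given words
        return sum(all(w in s for w in words) for s in doc_sets)

    score = 0.0
    for l, m in combinations(range(len(top_words)), 2):  # pairs with l < m
        w_l, w_m = top_words[l], top_words[m]
        score += log((doc_freq(w_l, w_m) + 1) / doc_freq(w_l))
    return score

# Hypothetical toy corpus: two "pet" documents and one "finance" document.
docs = [["cat", "dog", "pet"], ["dog", "pet", "vet"],
        ["stock", "market", "price"]]
coherent = umass_coherence(["dog", "pet"], docs)      # words that co-occur
incoherent = umass_coherence(["dog", "price"], docs)  # words that never do
```

Here `coherent` is log(3/2) while `incoherent` is log(1/2), so the pair that actually co-occurs scores higher, which is exactly the intuition the paper leans on when comparing topics learned from abstracts versus full texts.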
Pages: 165-174
Page count: 10