What is wrong with topic modeling? And how to fix it using search-based software engineering

Cited by: 128
Authors
Agrawal, Amritanshu [1 ]
Fu, Wei [1 ]
Menzies, Tim [1 ]
Affiliations
[1] North Carolina State Univ, Dept Comp Sci, Raleigh, NC 27695 USA
Funding
U.S. National Science Foundation;
Keywords
Topic modeling; Stability; LDA; Tuning; Differential evolution; Scatter search; Classification;
DOI
10.1016/j.infsof.2018.02.005
Chinese Library Classification (CLC)
TP [Automation technology, computer technology];
Discipline classification code
0812;
Abstract
Context: Topic modeling finds human-readable structures in unstructured textual data. A widely used topic modeling technique is Latent Dirichlet Allocation (LDA). When run on different datasets, LDA suffers from "order effects", i.e., different topics are generated if the order of the training data is shuffled. Such order effects introduce a systematic error into any such study. This error can lead to misleading results; specifically, inaccurate topic descriptions and a reduction in the efficacy of text mining classification results.
Objective: To provide a method in which the distributions generated by LDA are more stable and can be used for further analysis.
Method: We use LDADE, a search-based software engineering tool that uses Differential Evolution (DE) to tune LDA's parameters. LDADE is evaluated on data from a programmer information exchange site (Stack Overflow), the title and abstract text of thousands of Software Engineering (SE) papers, and software defect reports from NASA. Results were collected across different implementations of LDA (Python + Scikit-Learn, Scala + Spark) on the Linux platform and for different kinds of LDA (VEM, Gibbs sampling), and were scored via topic stability and text mining classification accuracy.
Results: In all treatments: (i) standard LDA exhibits very large topic instability; (ii) LDADE's tunings dramatically reduce cluster instability; (iii) LDADE also improves performance for supervised as well as unsupervised learning.
Conclusion: Due to topic instability, using standard LDA with its "off-the-shelf" settings should now be deprecated. Also, in the future, SE papers that use LDA should be required to test and (if needed) mitigate LDA topic instability. Finally, LDADE is a candidate technology for effectively and efficiently reducing that instability.
Pages: 74-88
Page count: 15
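The order effect and the DE-based tuning described in the abstract can be illustrated with a minimal sketch. This is not the authors' LDADE implementation; the corpus, the stability score, and the parameter ranges below are assumptions chosen for brevity. The idea: refit scikit-learn's online LDA on two shuffles of the same documents, score how many top topic words survive the reshuffle, and let SciPy's differential evolution search n_components and the Dirichlet priors for settings that maximise that stability.

# Illustrative sketch (not LDADE): LDA order effects and DE-style tuning.
# Requires scikit-learn, scipy, numpy; fetch_20newsgroups downloads data.
import numpy as np
from sklearn.datasets import fetch_20newsgroups
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from scipy.optimize import differential_evolution

def top_words(lda, vocab, n=10):
    # The n highest-weight words of every topic, as sets for easy overlap checks.
    return [frozenset(vocab[i] for i in comp.argsort()[-n:])
            for comp in lda.components_]

def stability(topics_a, topics_b, n=10):
    # Median overlap (0..1) between each topic in run A and its best match in run B.
    overlaps = [max(len(a & b) / float(n) for b in topics_b) for a in topics_a]
    return float(np.median(overlaps))

docs = fetch_20newsgroups(subset="train",
                          remove=("headers", "footers", "quotes")).data[:2000]
vec = CountVectorizer(max_features=2000, stop_words="english")
X = vec.fit_transform(docs)
vocab = vec.get_feature_names_out()

def fit_topics(X, k, alpha, beta, order_seed):
    # Shuffle document order; keep random_state fixed so only the order differs.
    order = np.random.RandomState(order_seed).permutation(X.shape[0])
    lda = LatentDirichletAllocation(n_components=k,
                                    doc_topic_prior=alpha,
                                    topic_word_prior=beta,
                                    learning_method="online",
                                    random_state=1)
    lda.fit(X[order])
    return top_words(lda, vocab)

# Off-the-shelf settings: same corpus, two different document orderings.
base = stability(fit_topics(X, 10, None, None, 0), fit_topics(X, 10, None, None, 1))
print("stability with default settings:", base)

# DE-style tuning loop: search k, alpha, beta to maximise cross-ordering stability.
def loss(params):
    k, alpha, beta = int(round(params[0])), params[1], params[2]
    return -stability(fit_topics(X, k, alpha, beta, 0),
                      fit_topics(X, k, alpha, beta, 1))

result = differential_evolution(loss, bounds=[(5, 20), (0.01, 1.0), (0.01, 1.0)],
                                maxiter=3, popsize=5, seed=1)
print("tuned stability:", -result.fun, "at k/alpha/beta =", result.x)

Here learning_method="online" is chosen because its mini-batch updates make the fit sensitive to document order, which is the effect being measured; note that each candidate evaluation fits LDA twice, so even this toy loop is computationally expensive.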