Ordering-Sensitive and Semantic-Aware Topic Modeling

Cited by: 0
Authors
Yang, Min [1]
Cui, Tianyi [2]
Tu, Wenting [1]
Affiliations
[1] Univ Hong Kong, Hong Kong, Peoples R China
[2] Zhejiang Univ, Hangzhou, Zhejiang, Peoples R China
Source
PROCEEDINGS OF THE TWENTY-NINTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE | 2015
Keywords
DOI
Not available
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Topic modeling of textual corpora is an important and challenging problem. Most previous work makes the "bag-of-words" assumption, which ignores the ordering of words. This assumption simplifies the computation, but it discards word-order information and the semantics of words in context. In this paper, we present a Gaussian Mixture Neural Topic Model (GMNTM) which incorporates both the ordering of words and the semantic meaning of sentences into topic modeling. Specifically, we represent each topic as a cluster of multi-dimensional vectors and embed the corpus into a collection of vectors generated by the Gaussian mixture model. Each word is affected not only by its topic, but also by the embedding vectors of its surrounding words and the context. The Gaussian mixture components and the topics of documents, sentences, and words can be learnt jointly. Extensive experiments show that our model learns better topics and more accurate word distributions for each topic. Quantitatively, compared with state-of-the-art topic modeling approaches, GMNTM achieves significantly better perplexity, retrieval accuracy, and classification accuracy.
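The abstract's core idea of representing each topic as a Gaussian component over embedding vectors can be illustrated with a minimal sketch. This is not the paper's GMNTM implementation (which jointly trains embeddings with document, sentence, and word topics); it only shows, under simplified assumptions, how plain EM fits a Gaussian mixture to fixed toy "word vectors" and assigns each vector a topic via component responsibilities.

```python
# Illustrative sketch (not the paper's GMNTM): topics as Gaussian mixture
# components over word embedding vectors, assigned by EM responsibilities.
import numpy as np

rng = np.random.default_rng(0)

# Toy "embeddings": two well-separated clusters of 2-D vectors,
# standing in for words drawn from two latent topics.
vecs = np.vstack([rng.normal(loc=-2.0, scale=0.3, size=(20, 2)),
                  rng.normal(loc=+2.0, scale=0.3, size=(20, 2))])

K = 2                                  # number of topics (mixture components)
means = vecs[rng.choice(len(vecs), K, replace=False)].copy()
covs = np.array([np.eye(2)] * K)       # identity initial covariances
weights = np.full(K, 1.0 / K)

def gauss_pdf(x, mu, cov):
    """Multivariate normal density evaluated at each row of x."""
    d = x - mu
    inv = np.linalg.inv(cov)
    norm = np.sqrt((2 * np.pi) ** len(mu) * np.linalg.det(cov))
    return np.exp(-0.5 * np.einsum('ni,ij,nj->n', d, inv, d)) / norm

for _ in range(25):                    # plain EM for the Gaussian mixture
    # E-step: responsibility of each topic (component) for each vector
    resp = np.stack([w * gauss_pdf(vecs, m, c)
                     for w, m, c in zip(weights, means, covs)], axis=1)
    resp /= resp.sum(axis=1, keepdims=True)
    # M-step: re-estimate mixture weights, means, and covariances
    nk = resp.sum(axis=0)
    weights = nk / len(vecs)
    means = (resp.T @ vecs) / nk[:, None]
    for k in range(K):
        d = vecs - means[k]
        covs[k] = (resp[:, k, None] * d).T @ d / nk[k] + 1e-6 * np.eye(2)

topics = resp.argmax(axis=1)           # hard topic assignment per vector
```

In GMNTM the vectors themselves are also learned, and each word's probability additionally conditions on its context vectors; the sketch fixes the embeddings purely to keep the mixture step visible.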
Pages: 2353-2359
Page count: 7