A probabilistic topic model based on short distance Co-occurrences

Cited by: 7
Authors
Rahimi, Marziea [1 ]
Zahedi, Morteza [1 ]
Mashayekhi, Hoda [1 ]
Affiliations
[1] Shahrood Univ Technol, Fac Comp Engn, Shahrood 3619995161, Iran
Keywords
Probabilistic topic model; Latent Dirichlet Allocation; Document clustering; Context window; Local co-occurrence; Word order; Noisy text; Discovery; Classification
DOI
10.1016/j.eswa.2022.116518
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405
Abstract
A limitation of many probabilistic topic models, such as Latent Dirichlet Allocation (LDA), is their inability to exploit local context. As a result, these models cannot directly benefit from short-distance co-occurrences, which are more likely to indicate meaningful word relationships. Some models, such as the Bigram Topic Model (BTM), consider local context by integrating language and topic models. However, because they take the exact word order into account, such models suffer severely from sparseness. Other models, like Latent Dirichlet Co-Clustering (LDCC), attempt to solve the problem by adding another level of granularity, treating a document as a bag of segments while ignoring the word order. In this paper, we introduce a new topic model that uses overlapping windows to encode local word relationships. In the proposed model, we assume a document is composed of fixed-size overlapping windows and formulate a new generative process accordingly. In the inference procedure, each word is sampled only once, in a single window, while influencing the sampling of the words it co-occurs with in other windows. Word relationships are discovered at the document level, but the topic of each word is derived from only its neighboring words within a window, to emphasize local word relationships. By using overlapping windows, without assuming an explicit dependency between adjacent words, we avoid ignoring word order completely. The proposed model is straightforward and not severely prone to sparseness, and, as the experimental results show, it produces more meaningful and more coherent topics than the three established models mentioned above.
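The abstract describes the window-based inference only at a high level. As a rough illustration of the idea, the following minimal Python sketch replaces the document-level topic counts in a standard collapsed-Gibbs LDA update with counts taken from a fixed-size window centred on each token, so only short-distance co-occurrences drive each word's topic. All names and the exact conditional (gibbs_pass, half, alpha, beta) are illustrative assumptions, not the paper's specification.

```python
import numpy as np

rng = np.random.default_rng(0)

def gibbs_pass(doc, z, n_kv, n_k, K, V, alpha=0.1, beta=0.01, half=2):
    """One sweep: resample each token's topic from an LDA-like
    conditional whose document-level term is replaced by topic counts
    from a fixed-size window around the token (local co-occurrence).
    Overlap means each token's topic influences its neighbours' updates."""
    for i in range(len(doc)):
        w, k_old = doc[i], z[i]
        lo, hi = max(0, i - half), min(len(doc), i + half + 1)
        local = np.bincount(z[lo:hi], minlength=K).astype(float)
        local[k_old] -= 1.0          # exclude the token being resampled
        n_kv[k_old, w] -= 1.0
        n_k[k_old] -= 1.0
        # window counts stand in for the usual document counts
        p = (local + alpha) * (n_kv[:, w] + beta) / (n_k + V * beta)
        k_new = rng.choice(K, p=p / p.sum())
        z[i] = k_new
        n_kv[k_new, w] += 1.0
        n_k[k_new] += 1.0

# Toy usage: a single 12-token "document" over a 5-word vocabulary.
K, V = 2, 5
doc = rng.integers(0, V, size=12)
z = rng.integers(0, K, size=len(doc))
n_kv = np.zeros((K, V)); n_k = np.zeros(K)
for w, k in zip(doc, z):
    n_kv[k, w] += 1; n_k[k] += 1
for _ in range(50):
    gibbs_pass(doc, z, n_kv, n_k, K, V)
print("topic assignments:", z)
```

A centred sliding window is only one plausible reading of "fixed-size overlapping windows"; the paper's generative process may partition and overlap windows differently, but the sketch captures the key contrast with LDA, namely that the topic conditional sees window-local rather than document-wide topic counts.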
Pages: 14