Probabilistic topic models for sequence data

Cited by: 21
Authors
Barbieri, Nicola [1 ]
Manco, Giuseppe [2 ]
Ritacco, Ettore [2 ]
Carnuccio, Marco [3 ]
Bevacqua, Antonio [3 ]
Affiliations
[1] Yahoo Res, Barcelona, Spain
[2] Italian Natl Res Council, Inst High Performance Comp & Networks ICAR, I-87036 Arcavacata Di Rende, CS, Italy
[3] Univ Calabria, Dept Elect Informat & Syst, I-87036 Arcavacata Di Rende, CS, Italy
Keywords
Recommender systems; Collaborative filtering; Probabilistic topic models; Performance;
DOI: 10.1007/s10994-013-5391-2
CLC number: TP18 [Artificial intelligence theory]
Subject classification codes: 081104; 0812; 0835; 1405
Abstract
Probabilistic topic models are widely used in different contexts to uncover the hidden structure in large text corpora. One of the main (and perhaps strongest) assumptions of these models is that the generative process follows a bag-of-words assumption, i.e., each token is independent of the previous one. We extend the popular Latent Dirichlet Allocation model by exploiting three different conditional Markovian assumptions: (i) the token generation depends on the current topic and on the previous token; (ii) the topic associated with each observation depends on the topic associated with the previous one; (iii) the token generation depends on the current and previous topic. For each of these modeling assumptions we present a Gibbs sampling procedure for parameter estimation. Experimental evaluation over real-world data shows the performance advantages, in terms of recall and precision, of the sequence-modeling approaches.
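To make the abstract's modeling assumption (ii) concrete, the following is a minimal, hypothetical sketch of a collapsed Gibbs sampling sweep for an LDA variant in which each token's topic depends on the previous token's topic (a first-order Markov chain over topics), while words are still emitted from the current topic as in standard LDA. All function and variable names here are illustrative and not taken from the paper; the paper's actual samplers and hyperparameterization may differ.

```python
import numpy as np

def gibbs_sweep(doc, z, n_kw, n_k, n_tr, alpha, beta, rng):
    """One collapsed Gibbs sweep over a single token sequence `doc`.

    Sketch of assumption (ii): topics form a Markov chain, words are
    drawn from the current topic. Counts:
      n_kw[k, w] - times word w was assigned to topic k
      n_k[k]     - total tokens assigned to topic k
      n_tr[j, k] - observed topic transitions j -> k
    """
    K, V = n_kw.shape
    for n, w in enumerate(doc):
        k_old = z[n]
        # remove the current assignment from all counts
        n_kw[k_old, w] -= 1
        n_k[k_old] -= 1
        if n > 0:
            n_tr[z[n - 1], k_old] -= 1
        if n + 1 < len(doc):
            n_tr[k_old, z[n + 1]] -= 1

        # word-likelihood term, as in standard collapsed LDA
        p = (n_kw[:, w] + beta) / (n_k + V * beta)
        # transition from the previous topic into the candidate topic
        if n > 0:
            prev = z[n - 1]
            p *= (n_tr[prev] + alpha) / (n_tr[prev].sum() + K * alpha)
        # transition from the candidate topic into the next topic
        if n + 1 < len(doc):
            nxt = z[n + 1]
            p *= (n_tr[:, nxt] + alpha) / (n_tr.sum(axis=1) + K * alpha)

        p /= p.sum()
        k_new = rng.choice(K, p=p)

        # restore all counts with the freshly sampled topic
        z[n] = k_new
        n_kw[k_new, w] += 1
        n_k[k_new] += 1
        if n > 0:
            n_tr[z[n - 1], k_new] += 1
        if n + 1 < len(doc):
            n_tr[k_new, z[n + 1]] += 1
    return z

# tiny usage example: one short "document" over a 5-word vocabulary
rng = np.random.default_rng(0)
K, V = 3, 5
doc = [0, 1, 1, 4, 2, 3]
z = rng.integers(K, size=len(doc))
n_kw = np.zeros((K, V))
n_k = np.zeros(K)
n_tr = np.zeros((K, K))
for n, w in enumerate(doc):
    n_kw[z[n], w] += 1
    n_k[z[n]] += 1
    if n > 0:
        n_tr[z[n - 1], z[n]] += 1
z = gibbs_sweep(doc, z, n_kw, n_k, n_tr, alpha=0.1, beta=0.01, rng=rng)
```

Note how the conditional for z_n picks up two transition factors (from z_{n-1} and into z_{n+1}) on top of the usual word term; assumptions (i) and (iii) would instead modify the word-likelihood term to condition on the previous token or the previous topic.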
Pages: 5-29 (25 pages)