Hierarchical Neural Language Models for Joint Representation of Streaming Documents and their Content

Cited: 50
Authors
Djuric, Nemanja [1 ]
Wu, Hao [1 ,2 ]
Radosavljevic, Vladan [1 ]
Grbovic, Mihajlo [1 ]
Bhamidipati, Narayan [1 ]
Affiliations
[1] Yahoo Labs, Sunnyvale, CA USA
[2] Univ Southern Calif, Los Angeles, CA USA
Source
PROCEEDINGS OF THE 24TH INTERNATIONAL CONFERENCE ON WORLD WIDE WEB (WWW 2015) | 2015
Keywords
Machine learning; document modeling; distributed representations; word embeddings; document embeddings
DOI
10.1145/2736277.2741643
CLC classification
TP [Automation technology; computer technology]
Discipline code
0812
Abstract
We consider the problem of learning distributed representations for documents in data streams. The documents are represented as low-dimensional vectors and are jointly learned with distributed vector representations of word tokens using a hierarchical framework with two embedded neural language models. In particular, we exploit the context of documents in streams and use one of the language models to model the document sequences, and the other to model word sequences within them. The models learn continuous vector representations for both word tokens and documents such that semantically similar documents and words are close in a common vector space. We discuss extensions to our model, which can be applied to personalized recommendation and social relationship mining by adding further user layers to the hierarchy, thus learning user-specific vectors to represent individual preferences. We validated the learned representations on a public movie rating data set from MovieLens, as well as on a large-scale Yahoo News data set comprising three months of user activity logs collected on Yahoo servers. The results indicate that the proposed model can learn useful representations of both documents and word tokens, outperforming the current state-of-the-art by a large margin.
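The hierarchy described in the abstract can be pictured as two coupled skip-gram-style models that share the document vectors: an upper level predicts neighboring documents in the stream, and a lower level predicts the words inside each document. The sketch below is illustrative only and is not the authors' implementation: the class and function names, toy stream, embedding dimensionality, learning rate, and negative-sampling update are all assumptions, and the word level is simplified so that the document vector alone predicts its words.

```python
# Minimal sketch of a two-level (document-sequence + word-sequence) embedding model
# with skip-gram-style negative sampling. All names and hyperparameters are
# illustrative assumptions, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class HierarchicalDocWordModel:
    def __init__(self, n_docs, n_words, dim=16, lr=0.05, neg=3):
        # Input ("target") and output ("context") tables, as in skip-gram.
        self.D_in = 0.1 * rng.standard_normal((n_docs, dim))   # document vectors
        self.D_out = np.zeros((n_docs, dim))                    # document context vectors
        self.W_out = np.zeros((n_words, dim))                   # word context vectors
        self.lr, self.neg = lr, neg

    def _sgns_step(self, v_in, out_table, pos_idx, n_items):
        """One negative-sampling update; returns the gradient w.r.t. v_in."""
        grad_in = np.zeros_like(v_in)
        pairs = [(pos_idx, 1.0)] + [(int(rng.integers(n_items)), 0.0) for _ in range(self.neg)]
        for idx, label in pairs:
            score = sigmoid(v_in @ out_table[idx])
            g = self.lr * (label - score)
            grad_in += g * out_table[idx]
            out_table[idx] += g * v_in
        return grad_in

    def train_doc_context(self, doc_id, neighbor_doc_id):
        """Upper level: a document predicts its neighbor in the stream."""
        self.D_in[doc_id] += self._sgns_step(
            self.D_in[doc_id], self.D_out, neighbor_doc_id, self.D_out.shape[0])

    def train_word_in_doc(self, doc_id, word_id):
        """Lower level: the document vector predicts the words it contains,
        which couples the two levels through the shared document vectors."""
        self.D_in[doc_id] += self._sgns_step(
            self.D_in[doc_id], self.W_out, word_id, self.W_out.shape[0])

# Toy usage: a "stream" of 4 documents, each a short list of word ids.
stream = [[0, 1, 2], [1, 2, 3], [4, 5], [0, 4, 5]]
model = HierarchicalDocWordModel(n_docs=len(stream), n_words=6)
for epoch in range(50):
    for d, words in enumerate(stream):
        if d + 1 < len(stream):          # adjacent documents form the document context
            model.train_doc_context(d, d + 1)
        for w in words:                   # words observed within the document
            model.train_word_in_doc(d, w)
```

After training on such a stream, documents that share vocabulary or appear in similar stream positions end up with nearby vectors in `model.D_in`, which is the property the abstract exploits for recommendation and similarity tasks.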
Pages: 248 - 255
Number of pages: 8