Encoding Syntactic Knowledge in Neural Networks for Sentiment Classification

Cited by: 80
Authors
Huang, Minlie [1 ]
Qian, Qiao [1 ]
Zhu, Xiaoyan [1 ]
Affiliations
[1] Tsinghua Univ, State Key Lab Intelligent Technol & Syst, Natl Lab Informat Sci & Technol, Dept Comp Sci & Technol, Beijing 100084, Peoples R China
Funding
US National Science Foundation;
Keywords
Neural networks; recursive neural network; long short-term memory; deep learning; representation learning; sentiment classification; sentiment analysis; representation
DOI
10.1145/3052770
CLC Number
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812
Abstract
Phrase/sentence representation is one of the most important problems in natural language processing. Many neural network models, such as the Convolutional Neural Network (CNN), Recursive Neural Network (RNN), and Long Short-Term Memory (LSTM), have been proposed to learn phrase/sentence representations; however, rich syntactic knowledge has not been fully explored when composing a longer text from its shorter constituent words. In most traditional models, only word embeddings are utilized to compose phrase/sentence representations, while the syntactic information of words is yet to be explored. In this article, we discover that encoding syntactic knowledge (part-of-speech tags) in neural networks can enhance sentence/phrase representation. Specifically, we propose to learn tag-specific composition functions and tag embeddings in recursive neural networks, and to utilize POS tags to control the gates of tree-structured LSTM networks. We evaluate these models on two benchmark datasets for sentiment classification, and demonstrate that improvements can be obtained with such syntactic knowledge encoded.
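The tag-specific composition idea described in the abstract can be illustrated with a minimal sketch: each POS tag gets its own composition matrix and tag embedding, and the parent node's tag selects which parameters are used when combining two child vectors. The tag set, dimensions, initialization, and the `compose` function below are all hypothetical simplifications, not the paper's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 8
TAGS = ["NN", "JJ", "VB", "RB"]  # hypothetical POS tag inventory

# One tag embedding, composition matrix, and bias per tag (randomly initialized).
tag_emb = {t: 0.1 * rng.standard_normal(DIM) for t in TAGS}
W = {t: 0.1 * rng.standard_normal((DIM, 2 * DIM)) for t in TAGS}
b = {t: np.zeros(DIM) for t in TAGS}

def compose(left_vec, right_vec, parent_tag):
    """Tag-specific composition: the parent's POS tag selects the
    composition parameters, and its tag embedding is added in."""
    child = np.concatenate([left_vec, right_vec])  # shape (2*DIM,)
    return np.tanh(W[parent_tag] @ child + b[parent_tag] + tag_emb[parent_tag])

# Compose a phrase like "very good": RB + JJ -> JJ node (hypothetical parse).
v_very = rng.standard_normal(DIM)
v_good = rng.standard_normal(DIM)
phrase = compose(v_very, v_good, "JJ")
```

In an untyped recursive network a single `(W, b)` pair would be shared across all nodes; keying the parameters by the parent's tag lets, e.g., adjective-headed and verb-headed phrases compose differently, which is the syntactic signal the paper exploits.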
Pages: 27