Attention-Based Bidirectional Long Short-Term Memory Networks for Relation Classification

Cited by: 1336
Authors
Zhou, Peng [1]
Shi, Wei [1]
Tian, Jun [1]
Qi, Zhenyu [1]
Li, Bingchen [1]
Hao, Hongwei [1]
Xu, Bo [1]
Affiliations
[1] Chinese Acad Sci, Inst Automat, Beijing, Peoples R China
Source
PROCEEDINGS OF THE 54TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2016), VOL 2 | 2016
Funding
National High-Tech Research and Development Program of China (863 Program);
Keywords
DOI
10.18653/v1/p16-2034
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Relation classification is an important semantic processing task in the field of natural language processing (NLP). State-of-the-art systems still rely on lexical resources such as WordNet or NLP systems like dependency parsers and named entity recognizers (NER) to get high-level features. Another challenge is that important information can appear at any position in the sentence. To tackle these problems, we propose Attention-Based Bidirectional Long Short-Term Memory Networks (Att-BLSTM) to capture the most important semantic information in a sentence. The experimental results on the SemEval-2010 relation classification task show that our method outperforms most of the existing methods, with only word vectors.
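
For readers who want a concrete picture of the model the abstract describes, below is a minimal PyTorch sketch of an attention-based bidirectional LSTM sentence classifier. It follows the general recipe in the abstract (word embeddings, a BiLSTM, a learned attention vector over time steps, a softmax classifier); the class name AttBLSTM, all layer sizes, and the 19-class output (assumed here to match the SemEval-2010 Task 8 label set) are illustrative assumptions, not the authors' released code.

# Minimal sketch of an attention-based bidirectional LSTM classifier in the
# spirit of Att-BLSTM. Layer sizes, the attention formulation, and the
# 19-class output are illustrative assumptions, not the paper's exact setup.
import torch
import torch.nn as nn

class AttBLSTM(nn.Module):
    def __init__(self, vocab_size, embed_dim=100, hidden_dim=100, num_classes=19):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # Bidirectional LSTM; forward and backward outputs are summed
        # element-wise so the sentence representation stays hidden_dim wide.
        self.lstm = nn.LSTM(embed_dim, hidden_dim,
                            batch_first=True, bidirectional=True)
        # Attention vector that scores every time step of the LSTM output.
        self.att_weight = nn.Parameter(torch.randn(hidden_dim, 1))
        self.fc = nn.Linear(hidden_dim, num_classes)

    def forward(self, token_ids):                      # (batch, seq_len)
        x = self.embedding(token_ids)                  # (batch, seq_len, embed_dim)
        out, _ = self.lstm(x)                          # (batch, seq_len, 2*hidden_dim)
        hidden = out.size(2) // 2
        h = out[:, :, :hidden] + out[:, :, hidden:]    # merge the two directions
        scores = torch.tanh(h) @ self.att_weight       # (batch, seq_len, 1)
        alpha = torch.softmax(scores, dim=1)           # attention over time steps
        r = torch.sum(alpha * h, dim=1)                # weighted sentence vector
        return self.fc(torch.tanh(r))                  # class logits

model = AttBLSTM(vocab_size=10000)
dummy = torch.randint(0, 10000, (2, 12))               # 2 sentences, 12 tokens each
print(model(dummy).shape)                              # torch.Size([2, 19])

Summing the forward and backward LSTM outputs instead of concatenating them is one common way to keep the attended sentence vector the same width as a single direction; it is used here as a simplifying choice.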
Pages: 207-212
Page count: 6