Two-Stage Attention Network for Aspect-Level Sentiment Classification

Times Cited: 1
Authors
Gao, Kai [1 ,2 ]
Xu, Hua [1 ]
Gao, Chengliang [1 ,2 ]
Sun, Xiaomin [1 ]
Deng, Junhui [1 ]
Zhang, Xiaoming [2 ]
Affiliations
[1] Tsinghua Univ, Dept Comp Sci & Technol, Beijing 100084, Peoples R China
[2] Hebei Univ Sci & Technol, Sch Informat Sci & Engn, Shijiazhuang 050018, Hebei, Peoples R China
Source
NEURAL INFORMATION PROCESSING (ICONIP 2018), PT IV | 2018, Vol. 11304
Funding
National Science Foundation (US);
Keywords
Attention mechanism; LSTM; Text representation; Aspect-level sentiment classification;
DOI
10.1007/978-3-030-04212-7_27
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
Currently, most attention-based works adopt a single-stage attention process when generating context representations toward an aspect, and they lack a deliberation process: a generated, aspect-related representation is used directly as the final output without further polishing. In this work, we introduce a deliberation process to model context and further polish the attention weights, and propose a two-stage attention network for aspect-level sentiment classification. The network uses a two-level attention model with an LSTM, where the first-stage attention generates a raw aspect-related representation and the second-stage attention polishes and refines the raw representation through the deliberation process. Since the deliberation component has global information about what the generated representation might be, it has the potential to produce a better aspect-related representation by looking a second time into the hidden states produced by the LSTM. Experimental results on the Laptop dataset of SemEval-2016 Task 5 indicate that our model achieves a state-of-the-art accuracy of 76.56%.
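The two-stage scheme described in the abstract can be illustrated with a minimal sketch: a first attention pass over LSTM hidden states produces a raw aspect-related vector, and a second (deliberation) pass re-attends over the same hidden states conditioned on both the aspect and that raw vector. The bilinear scoring form, parameter shapes, and the way the two queries are combined here are illustrative assumptions, not the exact formulation in the paper.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(H, query, W):
    # Bilinear attention: score each time step t as h_t^T W q,
    # then return the attention-weighted sum of hidden states.
    scores = H @ (W @ query)      # (T,)
    weights = softmax(scores)     # attention distribution over time steps
    return weights @ H            # aspect-related representation, (d,)

rng = np.random.default_rng(0)
T, d = 6, 8                           # time steps, hidden size (toy values)
H = rng.standard_normal((T, d))       # stands in for LSTM hidden states
aspect = rng.standard_normal(d)       # aspect embedding
W1 = rng.standard_normal((d, d))      # stage-1 attention parameters
W2 = rng.standard_normal((d, d))      # stage-2 (deliberation) parameters

# Stage 1: generate a raw aspect-related representation.
r_raw = attend(H, aspect, W1)

# Stage 2 (deliberation): look a second time into the same hidden
# states, conditioning the query on the aspect plus the raw result.
r_final = attend(H, aspect + r_raw, W2)
print(r_final.shape)
```

In a trained model the hidden states would come from an LSTM over the sentence and the parameters would be learned; the point of the sketch is only the control flow, i.e. that the second attention pass can correct the first because its query already summarizes the whole sentence.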
Pages: 316-325
Page count: 10
Cited References
14 in total
[1]  
[Anonymous], 2016, Machine comprehension using Match-LSTM and answer pointer
[2]  
Bahdanau D, 2016, Arxiv, DOI arXiv:1409.0473
[3]  
LeCun Y., Bengio Y., Hinton G., 2015, Deep learning, NATURE, 521(7553), P436-444
[4]  
Liu P, 2016, 2016 17TH INTERNATIONAL CONFERENCE ON ELECTRONIC PACKAGING TECHNOLOGY (ICEPT), P1480, DOI 10.1109/ICEPT.2016.7583403
[5]  
Luong T., 2015, Effective approaches to attention-based neural machine translation, P1412
[6]  
Qian Q., Huang M., Lei J., Zhu X., 2017, Linguistically Regularized LSTM for Sentiment Classification, PROCEEDINGS OF ACL 2017, VOL 1, P1679-1689
[7]  
Ruder S., 2016, arXiv:1609.02745
[8]  
Tang D., 2015, arXiv:1512.01100
[9]  
Tang D., 2016, PROCEEDINGS OF EMNLP 2016, P214, DOI 10.18653/v1/D16-1021
[10]  
Tay Y., Tuan L. A., Hui S. C., 2017, Dyadic Memory Networks for Aspect-based Sentiment Analysis, PROCEEDINGS OF CIKM 2017, P107-116