DA-BERT: Enhancing Part-of-Speech Tagging of Aspect Sentiment Analysis Using BERT

Citations: 10
Authors
Pei, Songwen [1 ,2 ,3 ]
Wang, Lulu [1 ]
Shen, Tianma [1 ]
Ning, Zhong [2 ]
Affiliations
[1] Univ Shanghai Sci & Technol, Sch Opt Elect & Comp Engn, Shanghai 200093, Peoples R China
[2] Fudan Univ, Sch Management, Shanghai 200433, Peoples R China
[3] Chinese Acad Sci, Inst Comp Technol, State Key Lab Comp Architecture, Beijing 100190, Peoples R China
Source
ADVANCED PARALLEL PROCESSING TECHNOLOGIES (APPT 2019) | 2019, Vol. 11719
Funding
National Natural Science Foundation of China
Keywords
Aspect sentiment classification; BERT; Deep-attention; Multi-attention; Part-of-speech; Sentiment analysis; Short text;
DOI
10.1007/978-3-030-29611-7_7
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
With the development of the Internet, text-based data from the web have grown exponentially, and these data carry a large amount of valuable information. As a vital branch of sentiment analysis, aspect sentiment analysis of short texts on social media has attracted the interest of researchers. Aspect sentiment classification is a form of fine-grained textual sentiment classification. Currently, the attention mechanism is mainly combined with RNN (Recurrent Neural Network) or LSTM (Long Short-Term Memory) networks. Such neural network-based sentiment analysis models not only have a complicated computational structure but also suffer from sequential computational dependence. To address these problems and improve the accuracy of target-based sentiment classification for short texts, we propose a neural network model that combines deep attention with Bidirectional Encoder Representations from Transformers (DA-BERT). The DA-BERT model can fully mine the relationships between target words and emotional words in a sentence, and it requires neither syntactic analysis of sentences nor external knowledge such as a sentiment lexicon. The training speed of the proposed DA-BERT model is greatly improved because the computational dependencies of the RNN structure are removed. In experiments on the SemEval2014 Task4 dataset with 300-dimensional word vectors, DA-BERT improves aspect sentiment classification accuracy by 13.63% on average compared with LSTM, TD-LSTM, TC-LSTM, AT-LSTM, ATAE-LSTM, and PAT-LSTM.
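The abstract describes attending from an aspect (target) term to its context over Transformer-style token representations, with attention stacked in depth rather than unrolled over time as in an RNN. The following NumPy sketch illustrates that general idea only: the function names, dimensions, and the use of random arrays as stand-ins for BERT token states and the aspect embedding are all illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D array.
    e = np.exp(x - x.max())
    return e / e.sum()

def target_context_attention(context, target, num_layers=2):
    """Stacked ("deep") attention: the aspect vector repeatedly attends
    over contextual token embeddings, refining its own representation.

    context: (seq_len, dim) token states, target: (dim,) aspect vector.
    """
    dim = context.shape[1]
    query = target
    for _ in range(num_layers):
        scores = context @ query / np.sqrt(dim)   # (seq_len,) similarity scores
        weights = softmax(scores)                 # attention distribution
        query = weights @ context                 # attended summary, shape (dim,)
    return query

rng = np.random.default_rng(0)
seq_len, dim, num_classes = 8, 16, 3              # e.g. negative/neutral/positive
context = rng.standard_normal((seq_len, dim))     # stand-in for BERT token states
target = rng.standard_normal(dim)                 # stand-in for aspect embedding

rep = target_context_attention(context, target)
W = rng.standard_normal((dim, num_classes))       # untrained classification head
probs = softmax(rep @ W)
pred = int(np.argmax(probs))
```

Unlike an RNN, nothing here depends on processing tokens one after another, which is the computational-dependence point the abstract makes; each attention layer is a single batched matrix operation.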
Pages: 86-95 (10 pages)