DA-BERT: Enhancing Part-of-Speech Tagging of Aspect Sentiment Analysis Using BERT

Cited by: 9
Authors
Pei, Songwen [1 ,2 ,3 ]
Wang, Lulu [1 ]
Shen, Tianma [1 ]
Ning, Zhong [2 ]
Affiliations
[1] Univ Shanghai Sci & Technol, Sch Opt Elect & Comp Engn, Shanghai 200093, Peoples R China
[2] Fudan Univ, Sch Management, Shanghai 200433, Peoples R China
[3] Chinese Acad Sci, Inst Comp Technol, State Key Lab Comp Architecture, Beijing 100190, Peoples R China
Source
ADVANCED PARALLEL PROCESSING TECHNOLOGIES (APPT 2019) | 2019 / Volume 11719
Funding
National Natural Science Foundation of China;
Keywords
Aspect sentiment classification; BERT; Deep-attention; Multi-attention; Part-of-speech; Sentiment analysis; Short text;
DOI
10.1007/978-3-030-29611-7_7
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
With the development of the Internet, text-based data from the web have grown exponentially, and these data carry a large amount of valuable information. As a vital branch of sentiment analysis, aspect sentiment analysis of short texts on social media has attracted the interest of researchers. Aspect sentiment classification is a kind of fine-grained textual sentiment classification. Currently, the attention mechanism is mainly combined with RNN (Recurrent Neural Network) or LSTM (Long Short-Term Memory) networks. Such neural-network-based sentiment analysis models not only have complicated computational structures but also suffer from sequential computational dependencies. To address these problems and improve the accuracy of target-based sentiment classification for short texts, we propose a neural network model that combines deep attention with Bidirectional Encoder Representations from Transformers (DA-BERT). The DA-BERT model can fully mine the relationships between target words and emotional words in a sentence, and it requires neither syntactic analysis of sentences nor external knowledge such as a sentiment lexicon. The training speed of the proposed DA-BERT model is greatly improved because the computational dependencies of the RNN structure are removed. Compared with LSTM, TD-LSTM, TC-LSTM, AT-LSTM, ATAE-LSTM, and PAT-LSTM, experiments on the SemEval2014 Task 4 dataset show that the accuracy of the DA-BERT model in aspect sentiment classification is improved by 13.63% on average with 300-dimensional word vectors.
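The following is a minimal, illustrative sketch (in PyTorch with the HuggingFace transformers library) of the general idea described in the abstract: a BERT encoder whose token representations are reweighted by an attention layer conditioned on the aspect (target) words before classification. The class name AspectAttentionClassifier, the additive attention form, and the mean-pooled aspect vector are assumptions made for illustration and are not taken from the paper; the authors' actual deep-attention design may differ.

# Illustrative sketch only: a generic aspect-conditioned attention layer on top
# of BERT, NOT the authors' exact DA-BERT architecture.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer


class AspectAttentionClassifier(nn.Module):
    def __init__(self, bert_name="bert-base-uncased", num_classes=3):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)
        hidden = self.bert.config.hidden_size
        # Additive attention: scores each sentence token against the aspect vector.
        self.attn = nn.Linear(hidden * 2, 1)
        self.classifier = nn.Linear(hidden, num_classes)

    def forward(self, input_ids, attention_mask, aspect_mask):
        # Token-level contextual representations from BERT: (batch, seq_len, hidden).
        tokens = self.bert(input_ids=input_ids,
                           attention_mask=attention_mask).last_hidden_state
        # Mean-pool the aspect (target) tokens into a single aspect vector.
        aspect_mask = aspect_mask.unsqueeze(-1).float()
        aspect_vec = (tokens * aspect_mask).sum(1) / aspect_mask.sum(1).clamp(min=1)
        # Attention weight of every sentence token with respect to the aspect.
        expanded = aspect_vec.unsqueeze(1).expand_as(tokens)
        scores = self.attn(torch.cat([tokens, expanded], dim=-1)).squeeze(-1)
        scores = scores.masked_fill(attention_mask == 0, float("-inf"))
        weights = torch.softmax(scores, dim=-1).unsqueeze(-1)
        context = (tokens * weights).sum(dim=1)
        # Logits for e.g. positive / negative / neutral.
        return self.classifier(context)


if __name__ == "__main__":
    tok = BertTokenizer.from_pretrained("bert-base-uncased")
    enc = tok("The food was great but the service was slow.", return_tensors="pt")
    # Mark the token(s) of the aspect term "service"; the index below was found
    # by inspecting the tokenization of this particular example.
    aspect_mask = torch.zeros_like(enc["input_ids"])
    aspect_mask[0, 7] = 1
    model = AspectAttentionClassifier()
    logits = model(enc["input_ids"], enc["attention_mask"], aspect_mask)
    print(logits.shape)  # torch.Size([1, 3])

In this sketch the attention layer scores each sentence token against the pooled aspect vector, which mirrors the abstract's goal of modeling the relationship between target words and emotional words without syntactic parsing or a sentiment lexicon.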
Pages: 86-95
Number of pages: 10
Related Papers
50 records in total (items [31]-[40] shown)
  • [31] Are Modern Deep Learning Models for Sentiment Analysis Brittle? An Examination on Part-of-Speech
    Alhazmi, Ahoud
    Zhang, Wei Emma
    Sheng, Quan Z.
    Aljubairy, Abdulwahab
    [J]. 2020 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2020,
  • [32] Sentiment Analysis of Customer Comments in Banking using BERT-based Approaches
    Masarifoglu, Melik
    Tigrak, Umit
    Hakyemez, Sefa
    Gul, Guven
    Bozan, Erdal
    Buyuklu, Ali Hakan
    Ozgur, Arzucan
    [J]. 29TH IEEE CONFERENCE ON SIGNAL PROCESSING AND COMMUNICATIONS APPLICATIONS (SIU 2021), 2021,
  • [33] Vaccine sentiment analysis using BERT + NBSVM and geo-spatial approaches
    Areeba Umair
    Elio Masciari
    Muhammad Habib Ullah
    [J]. The Journal of Supercomputing, 2023, 79 : 17355 - 17385
  • [34] Sentiment analysis of imbalanced datasets using BERT and ensemble stacking for deep learning
    Habbat, Nassera
    Nouri, Hicham
    Anoun, Houda
    Hassouni, Larbi
    [J]. ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2023, 126
  • [35] Enhancing Product Design through AI-Driven Sentiment Analysis of Amazon Reviews Using BERT
    Shaik Vadla, Mahammad Khalid
    Suresh, Mahima Agumbe
    Viswanathan, Vimal K.
    [J]. ALGORITHMS, 2024, 17 (02)
  • [36] Aspect-based Sentiment Analysis for Bengali Text using Bidirectional Encoder Representations from Transformers (BERT)
    Samia, Moythry Manir
    Rajee, Alimul
    Hasan, Md Rakib
    Faruq, Mohammad Omar
    Paul, Pintu Chandra
    [J]. INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2022, 13 (12) : 978 - 986
  • [37] Enhancing Sentiment Analysis for Chinese Texts Using a BERT-Based Model with a Custom Attention Mechanism
    Ding, Linlin
    Han, Yiming
    Li, Mo
    Li, Dong
    [J]. WEB INFORMATION SYSTEMS AND APPLICATIONS, WISA 2024, 2024, 14883 : 172 - 179
  • [38] CABiLSTM-BERT: Aspect-based sentiment analysis model based on deep implicit feature extraction
    He, Bo
    Zhao, Ruoyu
    Tang, Dali
    [J]. KNOWLEDGE-BASED SYSTEMS, 2025, 309
  • [39] An enhanced guided LDA model augmented with BERT based semantic strength for aspect term extraction in sentiment analysis
    Venugopalan, Manju
    Gupta, Deepa
    [J]. KNOWLEDGE-BASED SYSTEMS, 2022, 246
  • [40] Sentiment analysis on the impact of coronavirus in social life using the BERT model
    Mrityunjay Singh
    Amit Kumar Jakhar
    Shivam Pandey
    [J]. Social Network Analysis and Mining, 2021, 11