Graph convolutional network with multiple weight mechanisms for aspect-based sentiment analysis

Cited by: 40
Authors
Zhao, Ziguo [1 ]
Tang, Mingwei [1 ]
Tang, Wei [1 ]
Wang, Chunhao [1 ]
Chen, Xiaoliang [1 ]
Affiliations
[1] Xihua Univ, Sch Comp & Software Engn, Chengdu 610039, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Aspect-based sentiment analysis; Graph convolutional network; Multi-head self attention; BERT;
DOI
10.1016/j.neucom.2022.05.045
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Aspect-based sentiment analysis (ABSA) aims at determining the sentiment polarity of a given aspect term in a sentence. Recently, the graph convolutional network (GCN) has been applied to the ABSA task and has obtained promising results. Despite the proliferation of these methods and their success, prevailing GCN-based models lack a powerful constraint mechanism on message passing to aspect terms, introducing heavy noise during graph convolution. Further, they simply average the subword vectors from BERT to form word-level embeddings, failing to fully exploit the potential of BERT. To overcome these downsides, a graph convolutional network with multiple weight mechanisms is proposed for aspect-based sentiment analysis in this paper. Specifically, a dynamic weight alignment mechanism is proposed to encourage the model to make full use of BERT. Then an aspect-aware weight mechanism is designed to control message propagation toward the aspect term during the graph convolution operation. Afterwards, an aspect-oriented loading layer is presented to further reduce the adverse effects of words irrelevant to the aspect term. Finally, multi-head self attention is used to fuse high-order semantic and syntactic information. Hence, the model can obtain premium aspect-specific representations for prediction. Experiments demonstrate that the proposed model achieves state-of-the-art results compared with other models. (C) 2022 Elsevier B.V. All rights reserved.
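The abstract names two ideas only at a high level: pooling BERT subword vectors into word-level embeddings (the plain averaging the paper improves on), and weighting message passing toward the aspect term during graph convolution. The following NumPy sketch illustrates those two baseline ideas under stated assumptions; the gating scheme, function names, and the `alpha` parameter are hypothetical and are not the authors' actual mechanisms, whose formulas are not given in this record.

```python
import numpy as np

def pool_subwords(subword_vecs, word_ids):
    # Average the subword vectors belonging to each word.
    # (The paper proposes a dynamic weight alignment instead of
    # this plain mean; the mean is the baseline it improves on.)
    word_ids = np.asarray(word_ids)
    n_words = word_ids.max() + 1
    out = np.zeros((n_words, subword_vecs.shape[1]))
    for w in range(n_words):
        out[w] = subword_vecs[word_ids == w].mean(axis=0)
    return out

def gcn_layer(H, A, W, aspect_mask, alpha=0.5):
    # One graph-convolution step over a dependency adjacency A,
    # with a toy "aspect-aware" weight that damps messages from
    # words far from the aspect term (alpha is an invented gate,
    # not the paper's mechanism).
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    D_inv = np.diag(1.0 / A_hat.sum(axis=1))  # degree normalization
    gate = alpha + (1.0 - alpha) * aspect_mask
    return np.maximum(D_inv @ A_hat @ (H * gate[:, None]) @ W, 0.0)

# Tiny example: 5 subwords forming 3 words; word 1 is the aspect term.
rng = np.random.default_rng(0)
sub = rng.normal(size=(5, 4))
H = pool_subwords(sub, [0, 0, 1, 2, 2])       # (3, 4) word embeddings
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
mask = np.array([0.0, 1.0, 0.0])              # aspect-term position
out = gcn_layer(H, A, rng.normal(size=(4, 4)), mask)
print(out.shape)  # → (3, 4)
```

In the paper's full model this single step would be stacked and combined with multi-head self attention; the sketch only shows why an unweighted convolution propagates every neighbor's signal to the aspect node with equal force, which is the noise problem the abstract describes.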
Pages: 124-134
Page count: 11