A Local context focus learning model for joint multi-task using syntactic dependency relative distance

Cited by: 7
Authors
Qi, Rui-Hua [1 ,2 ]
Yang, Ming-Xin [1 ]
Jian, Yue [1 ]
Li, Zheng-Guang [1 ,2 ]
Chen, Heng [1 ,2 ]
Affiliations
[1] Dalian Univ Foreign Languages, Sch Software, Dalian 116044, Peoples R China
[2] Dalian Univ Foreign Languages, Res Ctr Language Intelligence, Dalian 116044, Peoples R China
Keywords
Sentiment analysis; Aspect; Multi-task; Multi-head attention; Local context focus; SENTIMENT; EXTRACTION; CLASSIFICATION; ATTENTION; NETWORK;
DOI
10.1007/s10489-022-03684-0
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Aspect-based sentiment analysis (ABSA) is a significant task in natural language processing. Although many ABSA systems have been proposed, the correlation between an aspect's sentiment polarity and the semantic information of its local context has received little attention. Moreover, aspect term extraction and aspect sentiment classification are fundamental subtasks of ABSA, yet most existing systems fail to exploit the natural relation between these two tasks and instead treat them as largely independent. In this work, a local context focus method is proposed that represents semantic distance using syntactic dependency relative distance, calculated over an undirected dependency graph. This method is introduced into a multi-task learning framework with a multi-head attention mechanism for the joint task of aspect term extraction and aspect sentiment classification. Compared with existing models, the proposed local context focus method measures semantic distance more precisely and helps the model capture more effective local semantic information. In addition, a multi-head attention mechanism is employed to further enhance the local semantic representation. Furthermore, the proposed model makes full use of the aspect term information and aspect sentiment information provided by the two subtasks, thereby improving overall performance. Experimental results on four datasets show that the proposed model outperforms single-task and multi-task models on both the aspect term extraction and aspect sentiment classification tasks.
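The syntactic dependency relative distance described in the abstract can be sketched as a breadth-first search over an undirected dependency graph: each token's distance is the length of the shortest dependency path to the nearest aspect token. The following is a minimal illustrative sketch, not the authors' implementation; the function name, the toy parse, and the (head, dependent) edge list are assumptions for demonstration.

```python
from collections import deque

def syntactic_relative_distance(edges, n_tokens, aspect_indices):
    """Shortest-path distance (in dependency edges) from every token
    to the nearest aspect token, treating the parse as undirected.

    edges: list of (head_index, dependent_index) pairs from a parser
    n_tokens: number of tokens in the sentence
    aspect_indices: token indices belonging to the aspect term
    """
    # Build an undirected adjacency list from the dependency arcs.
    adj = [[] for _ in range(n_tokens)]
    for head, dep in edges:
        adj[head].append(dep)
        adj[dep].append(head)

    # Multi-source BFS starting from all aspect tokens at distance 0.
    dist = [float("inf")] * n_tokens
    queue = deque()
    for i in aspect_indices:
        dist[i] = 0
        queue.append(i)
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if dist[v] == float("inf"):
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

# Toy parse of "the food was great" (indices: 0 the, 1 food, 2 was, 3 great):
# det(food, the), nsubj(great, food), cop(great, was)
print(syntactic_relative_distance([(1, 0), (3, 1), (3, 2)], 4, [1]))
```

A local context mask can then be derived from these distances, e.g. keeping (or down-weighting) tokens whose distance to the aspect exceeds a chosen threshold; the threshold value here would be a model hyperparameter.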
Pages: 4145-4161
Number of pages: 17