A Local context focus learning model for joint multi-task using syntactic dependency relative distance

Times Cited: 7
Authors
Qi, Rui-Hua [1 ,2 ]
Yang, Ming-Xin [1 ]
Jian, Yue [1 ]
Li, Zheng-Guang [1 ,2 ]
Chen, Heng [1 ,2 ]
Affiliations
[1] Dalian Univ Foreign Languages, Sch Software, Dalian 116044, Peoples R China
[2] Dalian Univ Foreign Languages, Res Ctr Language Intelligence, Dalian 116044, Peoples R China
Keywords
Sentiment analysis; Aspect; Multi-task; Multi-head attention; Local context focus; SENTIMENT; EXTRACTION; CLASSIFICATION; ATTENTION; NETWORK;
DOI
10.1007/s10489-022-03684-0
Chinese Library Classification (CLC) Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Aspect-based sentiment analysis (ABSA) is a significant task in natural language processing. Although many ABSA systems have been proposed, the correlation between an aspect's sentiment polarity and the semantic information of its local context has received little attention. Moreover, aspect term extraction and aspect sentiment classification are fundamental subtasks of ABSA, yet most existing systems fail to recognize the natural relation between these two tasks and therefore treat them as largely independent. In this work, a local context focus method is proposed that represents semantic distance by the syntactic dependency relative distance, calculated on an undirected dependency graph. We introduce this method into a multi-task learning framework with a multi-head attention mechanism for the joint task of aspect term extraction and aspect sentiment classification. Compared with existing models, the proposed local context focus method measures semantic distance more precisely and helps the model capture more effective local semantic information. In addition, a multi-head attention mechanism is employed to further enhance the local semantic representation. Furthermore, the proposed model makes full use of the aspect term information and aspect sentiment information provided by the two subtasks, thereby improving overall performance. Experimental results on four datasets show that the proposed model outperforms single-task and multi-task models on the aspect term extraction and aspect sentiment classification tasks.
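As a rough illustration of the syntactic dependency relative distance described in the abstract, the sketch below treats a dependency parse as an undirected graph and takes the shortest-path length from each token to the aspect token; tokens within a small distance of the aspect would then form its local context. This is only a minimal Python sketch of the idea under assumed inputs: the head-index representation of the parse, the example sentence, and the function name dependency_relative_distance are hypothetical and do not reflect the authors' actual implementation.

from collections import deque

def dependency_relative_distance(heads, aspect_idx):
    """Shortest-path distance from every token to the aspect token,
    computed over the undirected dependency graph.

    heads[i] is the index of token i's syntactic head (-1 for the root).
    Minimal sketch of the idea, not the paper's exact implementation.
    """
    n = len(heads)
    # Build an undirected adjacency list from the head pointers.
    adj = [[] for _ in range(n)]
    for child, head in enumerate(heads):
        if head >= 0:
            adj[child].append(head)
            adj[head].append(child)
    # Breadth-first search outward from the aspect token.
    dist = [float("inf")] * n
    dist[aspect_idx] = 0
    queue = deque([aspect_idx])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if dist[v] == float("inf"):
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

# Hypothetical parse of "The battery life is great":
# The -> battery, battery -> life, life -> is (root), great -> is
tokens = ["The", "battery", "life", "is", "great"]
heads  = [1, 2, 3, -1, 3]
print(dependency_relative_distance(heads, aspect_idx=2))  # [2, 1, 0, 1, 2]

A local context focus mechanism could then, for example, keep or up-weight tokens whose distance falls below a chosen threshold, so that "great" (distance 2 from "life") counts as local context even though it is several positions away in the word order.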
Pages: 4145-4161
Page count: 17