Interactive Dual Attention Network for Text Sentiment Classification

Cited by: 16
Authors
Zhu, Yinglin [1 ]
Zheng, Wenbin [1 ,2 ]
Tang, Hong [3 ]
Affiliations
[1] Chengdu Univ Informat Technol, Coll Software Engn, Chengdu 610225, Peoples R China
[2] Software Automat Generat & Intelligent Serv Key L, Chengdu 610225, Peoples R China
[3] Sichuan Normal Univ, Coll Engn, Chengdu 610068, Peoples R China
Keywords
LSTM;
DOI
10.1155/2020/8858717
CLC Number
Q [Biosciences];
Subject Classification Codes
07; 0710; 09;
Abstract
Text sentiment classification is an essential research field of natural language processing. Recently, numerous deep learning-based methods for sentiment classification have been proposed and have achieved better performance than conventional machine learning methods. However, most of these methods ignore the interactive relationship between contextual semantics and sentimental tendency when modeling the text representation. In this paper, we propose a novel Interactive Dual Attention Network (IDAN) that aims to interactively learn the representation of contextual semantics and sentimental tendency information. First, we design an algorithm that uses linguistic resources to obtain sentimental tendency information from the text, and we extract word embeddings from the BERT (Bidirectional Encoder Representations from Transformers) pretraining model as the embedding layer of IDAN. Next, we use two Bidirectional LSTM (BiLSTM) networks to learn the long-range dependencies of contextual semantics and of sentimental tendency information, respectively. Finally, two types of attention mechanism are implemented in IDAN. One is multihead attention, applied on top of the BiLSTM layers, which learns the interactive relationship between contextual semantics and sentimental tendency information. The other is global attention, which makes the model focus on the important parts of the sequence and generates the final representation for classification. Together, these two attention mechanisms enable IDAN to interactively learn the relationship between semantics and sentimental tendency information and improve the classification performance. Extensive experiments on four benchmark datasets show that IDAN outperforms competitive methods. Moreover, both the result analysis and the attention-weight visualization further demonstrate the effectiveness of the proposed method.
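The two attention operations the abstract describes can be sketched as follows. This is a minimal pure-Python illustration, not the authors' implementation: it assumes a simplified single-head, parameter-free scaled dot-product formulation, and all function names (`cross_attention`, `global_attention`) and the toy vectors are invented for illustration.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def cross_attention(queries, keys):
    # Each query vector (e.g. a semantic BiLSTM state) attends over the
    # other sequence (e.g. the sentiment-tendency BiLSTM states), so each
    # output row is a weighted mix of the other sequence's states.
    d = len(keys[0])
    out = []
    for q in queries:
        weights = softmax([dot(q, k) / math.sqrt(d) for k in keys])
        out.append([sum(w * k[i] for w, k in zip(weights, keys))
                    for i in range(d)])
    return out

def global_attention(states, context):
    # Pool a sequence into a single vector: score each state against a
    # context vector, then take the softmax-weighted sum of the states.
    weights = softmax([dot(h, context) for h in states])
    d = len(states[0])
    return [sum(w * h[i] for w, h in zip(weights, states)) for i in range(d)]

# Toy example: two semantic states attend over two sentiment states,
# then global attention pools the fused sequence into one representation.
semantic = [[1.0, 0.0], [0.0, 1.0]]
sentiment = [[1.0, 0.0], [0.5, 0.5]]
fused = cross_attention(semantic, sentiment)
representation = global_attention(fused, [1.0, 1.0])
```

In the actual model the queries, keys, and values pass through learned projections (multihead attention), and the global-attention context is a trained parameter; this sketch only shows the data flow of attending across the two sequences and then pooling for classification.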
Pages: 11