Leveraging contextual embeddings and self-attention neural networks with bi-attention for sentiment analysis

Cited by: 5
Authors
Biesialska, Magdalena [1 ]
Biesialska, Katarzyna [1 ]
Rybinski, Henryk [2 ]
Affiliations
[1] Univ Politecn Cataluna, Barcelona, Spain
[2] Warsaw Univ Technol, Warsaw, Poland
Keywords
Sentiment classification; Word embeddings; Transformer
DOI
10.1007/s10844-021-00664-7
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
People express their opinions and views in different and often ambiguous ways; hence, the meaning of their words is often not explicitly stated and frequently depends on the context. Therefore, it is difficult for machines to process and understand the information conveyed in human languages. This work addresses the problem of sentiment analysis (SA). We propose a simple yet comprehensive method which uses contextual embeddings and a self-attention mechanism to detect and classify sentiment. We perform experiments on reviews from different domains, as well as on languages from three different language families, including morphologically rich Polish and German. We show that our approach is on a par with state-of-the-art models or even outperforms them in several cases. Our work also demonstrates the superiority of models leveraging contextual embeddings. In sum, in this paper we take a step towards building a universal, multilingual sentiment classifier.
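The approach the abstract outlines — contextual token embeddings scored by self-attention, pooled into a sentence vector, then classified — can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the weight names (`Wq`, `Wk`, `Wv`, `u`, `Wc`, `b`), the random initialization, and the single-head setup are all illustrative assumptions, and the random matrix `X` stands in for embeddings that would normally come from a pretrained language model.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # Scaled dot-product self-attention over token embeddings X: (seq_len, d)
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])   # (seq_len, seq_len)
    return softmax(scores) @ V                # contextualized tokens, (seq_len, d)

def classify(X, p):
    H = self_attention(X, p["Wq"], p["Wk"], p["Wv"])
    alpha = softmax(H @ p["u"])               # attention weights over tokens
    sent = alpha @ H                          # weighted sum -> sentence vector, (d,)
    return softmax(sent @ p["Wc"] + p["b"])   # sentiment class probabilities

rng = np.random.default_rng(0)
d, n_classes, seq_len = 8, 3, 5               # toy sizes; real models are far larger
p = {name: rng.normal(scale=0.1, size=shape) for name, shape in
     [("Wq", (d, d)), ("Wk", (d, d)), ("Wv", (d, d)),
      ("u", (d,)), ("Wc", (d, n_classes)), ("b", (n_classes,))]}
X = rng.normal(size=(seq_len, d))             # stand-in for contextual embeddings
probs = classify(X, p)                        # one probability per sentiment class
```

With trained weights, `probs` would give the predicted sentiment distribution for the input review; here the weights are random, so only the shapes and the probability-simplex output are meaningful.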
Pages: 601-626 (26 pages)
Related papers (50 in total)
  • [21] Kerg, Giancarlo; Kanuparthi, Bhargav; Goyal, Anirudh; Goyette, Kyle; Bengio, Yoshua; Lajoie, Guillaume. Untangling tradeoffs between recurrence and self-attention in neural networks. Advances in Neural Information Processing Systems 33 (NeurIPS 2020), 2020, 33.
  • [22] Bhuvaneshwari, P.; Nagaraja Rao, A.; Harold Robinson, Y.; Thippeswamy, M. N. Sentiment analysis for user reviews using Bi-LSTM self-attention based CNN model. Multimedia Tools and Applications, 2022, 81: 12405-12419.
  • [23] Deng, Dong; Jing, Liping; Yu, Jian; Sun, Shaolong. Sparse Self-Attention LSTM for Sentiment Lexicon Construction. IEEE/ACM Transactions on Audio, Speech, and Language Processing, 2019, 27(11): 1777-1790.
  • [24] Zhang, Han; Goodfellow, Ian; Metaxas, Dimitris; Odena, Augustus. Self-Attention Generative Adversarial Networks. International Conference on Machine Learning, 2019, 97.
  • [25] Brotchie, James; Shao, Wei; Li, Wenchao; Kealy, Allison. Leveraging Self-Attention Mechanism for Attitude Estimation in Smartphones. Sensors, 2022, 22(22).
  • [26] Leng, Xue-Liang; Miao, Xiao-Ai; Liu, Tao. Using recurrent neural network structure with Enhanced Multi-Head Self-Attention for sentiment analysis. Multimedia Tools and Applications, 2021, 80(08): 12581-12600.
  • [28] Fang, Sen; Tan, You-Shuai; Zhang, Tao; Liu, Yepang. Self-Attention Networks for Code Search. Information and Software Technology, 2021, 134.
  • [29] Yang, Baosong; Tu, Zhaopeng; Wong, Derek F.; Meng, Fandong; Chao, Lidia S.; Zhang, Tong. Modeling Localness for Self-Attention Networks. 2018 Conference on Empirical Methods in Natural Language Processing (EMNLP 2018), 2018: 4449-4458.
  • [30] Kim, Byunghyun; Lee, Dohyun; Min, Kyeongyuk; Chong, Jongwha; Joe, Inwhee. Global Convolutional Neural Networks With Self-Attention for Fisheye Image Rectification. IEEE Access, 2022, 10: 129580-129587.