Leveraging contextual embeddings and self-attention neural networks with bi-attention for sentiment analysis

Cited by: 5
Authors
Biesialska, Magdalena [1 ]
Biesialska, Katarzyna [1 ]
Rybinski, Henryk [2 ]
Affiliations
[1] Univ Politecn Cataluna, Barcelona, Spain
[2] Warsaw Univ Technol, Warsaw, Poland
Keywords
Sentiment classification; Word embeddings; Transformer;
DOI
10.1007/s10844-021-00664-7
CLC number
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
People express their opinions and views in different and often ambiguous ways; hence, the meaning of their words is often not stated explicitly and frequently depends on the context. Therefore, it is difficult for machines to process and understand the information conveyed in human languages. This work addresses the problem of sentiment analysis (SA). We propose a simple yet comprehensive method that uses contextual embeddings and a self-attention mechanism to detect and classify sentiment. We perform experiments on reviews from different domains, as well as on languages from three different language families, including morphologically rich Polish and German. We show that our approach is on a par with state-of-the-art models or even outperforms them in several cases. Our work also demonstrates the superiority of models leveraging contextual embeddings. In sum, this paper takes a step towards building a universal, multilingual sentiment classifier.
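For readers who want a concrete picture of the general approach the abstract describes, the sketch below (PyTorch; not the authors' code) passes precomputed contextual embeddings, e.g. BERT or ELMo hidden states, through a self-attention layer, pools the result, and applies a linear sentiment classifier. All class names, dimensions, and hyperparameters are illustrative assumptions, and the bi-attention component mentioned in the title is omitted here.

# A minimal sketch, assuming PyTorch >= 1.9; names are illustrative,
# not taken from the paper.
import torch
import torch.nn as nn


class SelfAttentionSentimentClassifier(nn.Module):
    """Classifies sentiment from precomputed contextual token embeddings."""

    def __init__(self, embed_dim: int = 768, num_heads: int = 8, num_classes: int = 3):
        super().__init__()
        # Self-attention over the sequence of contextual embeddings.
        self.self_attention = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(embed_dim)
        # Mean- and max-pooled sentence representations are concatenated
        # before the linear classification head.
        self.classifier = nn.Linear(2 * embed_dim, num_classes)

    def forward(self, embeddings: torch.Tensor, padding_mask: torch.Tensor) -> torch.Tensor:
        # embeddings: (batch, seq_len, embed_dim), e.g. hidden states of a
        # pretrained contextual encoder such as BERT or ELMo.
        # padding_mask: (batch, seq_len), True where a position is padding.
        attended, _ = self.self_attention(
            embeddings, embeddings, embeddings, key_padding_mask=padding_mask
        )
        attended = self.norm(attended + embeddings)  # residual connection
        # Zero out padding positions before pooling over the sequence.
        attended = attended.masked_fill(padding_mask.unsqueeze(-1), 0.0)
        mean_pool = attended.sum(dim=1) / (~padding_mask).sum(dim=1, keepdim=True)
        max_pool = attended.masked_fill(padding_mask.unsqueeze(-1), float("-inf")).max(dim=1).values
        return self.classifier(torch.cat([mean_pool, max_pool], dim=-1))


# Toy usage: random tensors stand in for real contextual embeddings.
model = SelfAttentionSentimentClassifier()
embeddings = torch.randn(2, 16, 768)                  # 2 sentences, 16 tokens each
padding_mask = torch.zeros(2, 16, dtype=torch.bool)   # no padding in this example
logits = model(embeddings, padding_mask)              # (2, 3) sentiment class scores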
Pages: 601-626
Page count: 26
Related papers
50 items in total
  • [31] Lipschitz Normalization for Self-Attention Layers with Application to Graph Neural Networks
    Dasoulas, George
    Scaman, Kevin
    Virmaux, Aladin
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139
  • [32] Adaptive Feature Self-Attention in Spiking Neural Networks for Hyperspectral Classification
    Li, Heng
    Tu, Bing
    Liu, Bo
    Li, Jun
    Plaza, Antonio
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2025, 63
  • [33] Combining convolutional neural networks and self-attention for fundus diseases identification
    Wang, Keya
    Xu, Chuanyun
    Li, Gang
    Zhang, Yang
    Zheng, Yu
    Sun, Chengjie
    SCIENTIFIC REPORTS, 2023, 13 (01)
  • [34] Sparse self-attention aggregation networks for neural sequence slice interpolation
    Wang, Zejin
    Liu, Jing
    Chen, Xi
    Li, Guoqing
    Han, Hua
    BIODATA MINING, 2021, 14 (01)
  • [35] Original Music Generation using Recurrent Neural Networks with Self-Attention
    Jagannathan, Akash
    Chandrasekaran, Bharathi
    Dutta, Shubham
    Patil, Uma Rameshgouda
    Eirinaki, Magdalini
    2022 FOURTH IEEE INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE TESTING (AITEST 2022), 2022, : 56 - 63
  • [36] Spatial-Temporal Self-Attention for Asynchronous Spiking Neural Networks
    Wang, Yuchen
    Shi, Kexin
    Lu, Chengzhuo
    Liu, Yuguo
    Zhang, Malu
    Qu, Hong
    PROCEEDINGS OF THE THIRTY-SECOND INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, IJCAI 2023, 2023, : 3085 - 3093
  • [37] Global Convolutional Neural Networks With Self-Attention for Fisheye Image Rectification
    Kim, Byunghyun
    Lee, Dohyun
    Min, Kyeongyuk
    Chong, Jongwha
    Joe, Inwhee
IEEE ACCESS, 2022, 10 : 129580 - 129587
  • [39] EPILEPTIC SPIKE DETECTION BY RECURRENT NEURAL NETWORKS WITH SELF-ATTENTION MECHANISM
    Fukumori, Kosuke
    Yoshida, Noboru
    Sugano, Hidenori
    Nakajima, Madoka
    Tanaka, Toshihisa
    2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2022, : 1406 - 1410