Leveraging contextual embeddings and self-attention neural networks with bi-attention for sentiment analysis

Cited by: 5
Authors
Biesialska, Magdalena [1 ]
Biesialska, Katarzyna [1 ]
Rybinski, Henryk [2 ]
Affiliations
[1] Univ Politecn Cataluna, Barcelona, Spain
[2] Warsaw Univ Technol, Warsaw, Poland
Keywords
Sentiment classification; Word embeddings; Transformer
DOI
10.1007/s10844-021-00664-7
CLC number
TP18 [Theory of artificial intelligence]
Discipline codes
081104; 0812; 0835; 1405
Abstract
People express their opinions and views in different and often ambiguous ways, hence the meaning of their words is often not explicitly stated and frequently depends on the context. Therefore, it is difficult for machines to process and understand the information conveyed in human languages. This work addresses the problem of sentiment analysis (SA). We propose a simple yet comprehensive method which uses contextual embeddings and a self-attention mechanism to detect and classify sentiment. We perform experiments on reviews from different domains, as well as on languages from three different language families, including morphologically rich Polish and German. We show that our approach is on a par with state-of-the-art models or even outperforms them in several cases. Our work also demonstrates the superiority of models leveraging contextual embeddings. In sum, in this paper we take a step towards building a universal, multilingual sentiment classifier.
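The abstract describes combining contextual token embeddings with a self-attention mechanism to classify sentiment. As a rough illustration of that idea (not the authors' actual model or hyperparameters), the sketch below applies scaled dot-product self-attention to a matrix of precomputed contextual embeddings, mean-pools the attended representations, and feeds them through a linear layer to obtain class probabilities. All function names and shapes here are hypothetical; in the paper the embeddings would come from a contextual encoder such as BERT or ELMo.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X):
    """Scaled dot-product self-attention over contextual embeddings.

    X: (seq_len, d) matrix of contextual token embeddings.
    Returns context-enriched token representations of the same shape.
    """
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)       # (seq_len, seq_len) similarity scores
    weights = softmax(scores, axis=-1)  # attention distribution per token
    return weights @ X                  # weighted mixture of token embeddings

def classify_sentiment(X, W_out, b_out):
    """Mean-pool attended tokens, then a linear layer + softmax -> class probs."""
    pooled = self_attention(X).mean(axis=0)   # (d,) sentence representation
    return softmax(pooled @ W_out + b_out)    # (num_classes,) probabilities

# Toy usage with random "contextual embeddings" (5 tokens, 8 dims, 3 classes).
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8))
W_out = rng.normal(size=(8, 3))
b_out = np.zeros(3)
probs = classify_sentiment(X, W_out, b_out)
```

In practice the attention layer would use learned query/key/value projections (and, per the title, a bi-attention variant over two directions), but the core computation, attention weights reweighting contextual embeddings before pooling, is the same.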
Pages: 601-626
Number of pages: 26
Related papers
50 records in total
  • [41] Enhancing ENSO predictions with self-attention ConvLSTM and temporal embeddings
    Rui, Chuang
    Sun, Zhengya
    Zhang, Wensheng
    Liu, An-An
    Wei, Zhiqiang
    FRONTIERS IN MARINE SCIENCE, 2024, 11
  • [42] RoBERTa, ResNeXt and BiLSTM with self-attention: The ultimate trio for customer sentiment analysis
    Lak, Amir Jabbary
    Boostani, Reza
    Alenizi, Farhan A.
    Mohammed, Amin Salih
    Fakhrahmad, Seyed Mostafa
    APPLIED SOFT COMPUTING, 2024, 164
  • [43] Sentiment Analysis Model Based on Self-Attention and Character-Level Embedding
    Xia, Hongbin
    Ding, Chenhui
    Liu, Yuan
    IEEE ACCESS, 2020, 8 : 184614 - 184620
  • [44] Self-Attention Bi-LSTM Networks for Radar Signal Modulation Recognition
    Wei, Shunjun
    Qu, Qizhe
    Zeng, Xiangfeng
    Liang, Jiadian
    Shi, Jun
    Zhang, Xiaoling
    IEEE TRANSACTIONS ON MICROWAVE THEORY AND TECHNIQUES, 2021, 69 (11) : 5160 - 5172
  • [45] Multi-entity sentiment analysis using self-attention based hierarchical dilated convolutional neural network
    Gan, Chenquan
    Wang, Lu
    Zhang, Zufan
    FUTURE GENERATION COMPUTER SYSTEMS-THE INTERNATIONAL JOURNAL OF ESCIENCE, 2020, 112 : 116 - 125
  • [46] SELF-ATTENTION NEURAL BAG-OF-FEATURES
    Chumachenko, Kateryna
    Iosifidis, Alexandros
    Gabbouj, Moncef
    2022 IEEE 32ND INTERNATIONAL WORKSHOP ON MACHINE LEARNING FOR SIGNAL PROCESSING (MLSP), 2022,
  • [47] Universal Graph Transformer Self-Attention Networks
    Dai Quoc Nguyen
    Tu Dinh Nguyen
    Dinh Phung
    COMPANION PROCEEDINGS OF THE WEB CONFERENCE 2022, WWW 2022 COMPANION, 2022, : 193 - 196
  • [48] Context-Aware Self-Attention Networks
    Yang, Baosong
    Li, Jian
    Wong, Derek F.
    Chao, Lidia S.
    Wang, Xing
    Tu, Zhaopeng
    THIRTY-THIRD AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FIRST INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / NINTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2019, : 387 - 394
  • [49] CSAN: Contextual Self-Attention Network for User Sequential Recommendation
    Huang, Xiaowen
    Qian, Shengsheng
    Fang, Quan
    Sang, Jitao
    Xu, Changsheng
    PROCEEDINGS OF THE 2018 ACM MULTIMEDIA CONFERENCE (MM'18), 2018, : 447 - 455
  • [50] Feature Importance Estimation with Self-Attention Networks
    Skrlj, Blaz
    Dzeroski, Saso
    Lavrac, Nada
    Petkovic, Matej
    ECAI 2020: 24TH EUROPEAN CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2020, 325 : 1491 - 1498