Incorporating Dynamic Semantics into Pre-Trained Language Model for Aspect-based Sentiment Analysis

Cited by: 0
Authors
Zhang, Kai [1 ]
Zhang, Kun [2 ]
Zhang, Mengdi [3 ]
Zhao, Hongke [4 ]
Liu, Qi [1 ]
Wu, Wei [3 ]
Chen, Enhong [1 ]
Affiliations
[1] Univ Sci & Technol China, Sch Data Sci, Hefei, Peoples R China
[2] Hefei Univ Technol, Sch Comp Sci & Informat Engn, Hefei, Peoples R China
[3] Meituan, Beijing, Peoples R China
[4] Tianjin Univ, Coll Management & Econ, Tianjin, Peoples R China
Funding
National Key R&D Program of China; National Natural Science Foundation of China;
Keywords
CONSCIOUSNESS; INFORMATION; ATTENTION; NETWORK;
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Aspect-based sentiment analysis (ABSA) predicts the sentiment polarity towards a specific aspect in a given sentence. While pre-trained language models such as BERT have achieved great success, incorporating dynamic semantic changes into ABSA remains challenging. To this end, we propose Dynamic Re-weighting BERT (DR-BERT), a novel method designed to learn dynamic aspect-oriented semantics for ABSA. Specifically, we first take the Stack-BERT layers as a primary encoder to grasp the overall semantics of the sentence, and then fine-tune it by incorporating a lightweight Dynamic Re-weighting Adapter (DRA). Notably, the DRA can pay close attention to a small region of the sentence at each step and re-weight the vitally important words for better aspect-aware sentiment understanding. Finally, experimental results on three benchmark datasets demonstrate the effectiveness and rationality of our proposed model and provide interpretable insights for future semantic modeling.
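The record does not include the DRA's actual equations, so the following is only a toy sketch of the aspect-oriented re-weighting idea the abstract describes: at each step, attend to a small region of the sentence (here approximated by the top-k tokens most similar to the aspect) and accumulate extra weight on those words. All function names, the top-k selection, and the dot-product affinity are illustrative assumptions, not the authors' method.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D array.
    e = np.exp(x - x.max())
    return e / e.sum()

def dynamic_reweight(token_vecs, aspect_vec, k=2, steps=3):
    """Toy sketch (not the paper's DRA): at each step, pick a small
    region (top-k tokens by aspect affinity) and boost their weights."""
    weights = np.ones(len(token_vecs))
    for _ in range(steps):
        scores = token_vecs @ aspect_vec       # aspect-token affinity
        topk = np.argsort(scores)[-k:]         # small region per step
        boost = np.zeros_like(weights)
        boost[topk] = softmax(scores[topk])    # normalized emphasis
        weights = weights + boost              # accumulate over steps
    # Re-weighted token representations for downstream classification.
    weighted = weights[:, None] * token_vecs
    return weights, weighted
```

In this sketch the token weighted most heavily is the one with the highest affinity to the aspect vector, which mirrors the abstract's claim that the adapter re-weights the vitally important words for the given aspect.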
Pages: 3599 - 3610
Page count: 12
Related papers
50 records in total
  • [1] Aspect-Based Sentiment Analysis of Social Media Data With Pre-Trained Language Models
    Troya, Anina
    Pillai, Reshmi Gopalakrishna
    Rivero, Cristian Rodriguez
    Genc, Zulkuf
    Kayal, Subhradeep
    Araci, Dogu
    2021 5TH INTERNATIONAL CONFERENCE ON NATURAL LANGUAGE PROCESSING AND INFORMATION RETRIEVAL, NLPIR 2021, 2021, : 8 - 17
  • [2] Aspect-Based Sentiment Analysis in Hindi Language by Ensembling Pre-Trained mBERT Models
    Pathak, Abhilash
    Kumar, Sudhanshu
    Roy, Partha Pratim
    Kim, Byung-Gyu
    ELECTRONICS, 2021, 10 (21)
  • [3] Aspect Based Sentiment Analysis by Pre-trained Language Representations
    Liang Tianxin
    Yang Xiaoping
    Zhou Xibo
    Wang Bingqian
    2019 IEEE INTL CONF ON PARALLEL & DISTRIBUTED PROCESSING WITH APPLICATIONS, BIG DATA & CLOUD COMPUTING, SUSTAINABLE COMPUTING & COMMUNICATIONS, SOCIAL COMPUTING & NETWORKING (ISPA/BDCLOUD/SOCIALCOM/SUSTAINCOM 2019), 2019, : 1262 - 1265
  • [4] LETS: A Label-Efficient Training Scheme for Aspect-Based Sentiment Analysis by Using a Pre-Trained Language Model
    Shim, Heereen
    Lowet, Dietwig
    Luca, Stijn
    Vanrumste, Bart
    IEEE ACCESS, 2021, 9 : 115563 - 115578
  • [5] Pre-trained Word Embeddings for Arabic Aspect-Based Sentiment Analysis of Airline Tweets
    Ashi, Mohammed Matuq
    Siddiqui, Muazzam Ahmed
    Nadeem, Farrukh
    PROCEEDINGS OF THE INTERNATIONAL CONFERENCE ON ADVANCED INTELLIGENT SYSTEMS AND INFORMATICS 2018, 2019, 845 : 241 - 251
  • [6] Incorporating emoji sentiment information into a pre-trained language model for Chinese and English sentiment analysis
    Huang, Jiaming
    Li, Xianyong
    Li, Qizhi
    Du, Yajun
    Fan, Yongquan
    Chen, Xiaoliang
    Huang, Dong
    Wang, Shumin
    INTELLIGENT DATA ANALYSIS, 2024, 28 (06) : 1601 - 1625
  • [7] SA-ASBA: a hybrid model for aspect-based sentiment analysis using synthetic attention in pre-trained language BERT model with extreme gradient boosting
    Mewada, Arvind
    Dewang, Rupesh Kumar
    JOURNAL OF SUPERCOMPUTING, 2023, 79 (05): : 5516 - 5551
  • [8] Leveraging Pre-trained Language Model for Speech Sentiment Analysis
    Shon, Suwon
    Brusco, Pablo
    Pan, Jing
    Han, Kyu J.
    Watanabe, Shinji
    INTERSPEECH 2021, 2021, : 3420 - 3424
  • [9] AraXLNet: pre-trained language model for sentiment analysis of Arabic
    Alduailej, Alhanouf
    Alothaim, Abdulrahman
    JOURNAL OF BIG DATA, 2022, 9 (01)