Domain Adaptation of Transformer-Based Models Using Unlabeled Data for Relevance and Polarity Classification of German Customer Feedback

Cited by: 0
Authors
Idrissi-Yaghir A. [1,3]
Schäfer H. [1,2]
Bauer N. [1]
Friedrich C.M. [1,3]
Affiliations
[1] Department of Computer Science, University of Applied Sciences and Arts Dortmund (FHDO), Emil-Figge Str. 42, Dortmund
[2] Institute for Transfusion Medicine, University Hospital Essen, Hufelandstraße 55, Essen
[3] Institute for Medical Informatics, Biometry and Epidemiology (IMIBE), University Hospital Essen, Hufelandstraße 55, Essen
Keywords
Domain adaptation; Sentiment analysis; Text classification; Transformer-based models
DOI
10.1007/s42979-022-01563-6
Abstract
Understanding customer feedback is becoming a necessity for companies that want to identify problems and improve their products and services. Text classification and sentiment analysis can play a major role in analyzing such data using a variety of machine learning and deep learning approaches. In this work, different transformer-based models are used to explore how effective they are on a German customer feedback dataset. In addition, these pre-trained models are further analyzed to determine whether adapting them to a specific domain using unlabeled data yields better results than off-the-shelf pre-trained models. To evaluate the models, two downstream tasks from GermEval 2017 are considered. The experimental results show that transformer-based models achieve significant improvements over a fastText baseline and outperform the previously published scores and models. For the Relevance Classification subtask, the best models achieve a micro-averaged F1-score of 96.1% on the first test set and 95.9% on the second, and scores of 85.1% and 85.3% for the Polarity Classification subtask. © 2023, The Author(s).
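As a rough illustration of the workflow the abstract describes, the sketch below shows domain-adaptive pre-training of a German BERT checkpoint via masked language modelling on unlabeled text, followed by loading the adapted weights for sequence classification, using the Hugging Face transformers library. This is not the paper's implementation: the base checkpoint name, file path, and hyperparameters are assumptions chosen for illustration.

```python
# Hedged sketch of domain adaptation with unlabeled data, then task fine-tuning.
# Model name, file path, and hyperparameters are illustrative assumptions,
# not the configuration reported in the paper.
from transformers import (
    AutoModelForMaskedLM,
    AutoModelForSequenceClassification,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)
from datasets import load_dataset

base_model = "bert-base-german-cased"  # assumed base checkpoint
tokenizer = AutoTokenizer.from_pretrained(base_model)

# Step 1: continue pre-training (MLM) on unlabeled in-domain customer feedback.
unlabeled = load_dataset("text", data_files={"train": "unlabeled_feedback.txt"})  # hypothetical file
tokenized = unlabeled.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=256),
    batched=True,
    remove_columns=["text"],
)
mlm_model = AutoModelForMaskedLM.from_pretrained(base_model)
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)
mlm_args = TrainingArguments(
    output_dir="adapted-german-bert",
    num_train_epochs=3,
    per_device_train_batch_size=16,
)
Trainer(
    model=mlm_model,
    args=mlm_args,
    train_dataset=tokenized["train"],
    data_collator=collator,
).train()
mlm_model.save_pretrained("adapted-german-bert")
tokenizer.save_pretrained("adapted-german-bert")

# Step 2: load the adapted checkpoint for a downstream classification head.
# num_labels=2 for Relevance Classification; Polarity Classification would use 3.
clf_model = AutoModelForSequenceClassification.from_pretrained(
    "adapted-german-bert", num_labels=2
)
# Fine-tuning on the labeled GermEval 2017 data would follow the same Trainer
# pattern, evaluated with a micro-averaged F1 score as in the abstract.
```

The same adapted checkpoint can be reused for both subtasks; only the classification head and label count change between relevance and polarity fine-tuning.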