A Study on Performance Enhancement by Integrating Neural Topic Attention with Transformer-Based Language Model

Cited: 1
Authors
Um, Taehum [1 ]
Kim, Namhyoung [1 ]
Affiliations
[1] Gachon Univ, Dept Appl Stat, 1342 Seongnam Daero, Seongnam 13120, South Korea
Source
APPLIED SCIENCES-BASEL | 2024, Vol. 14, Iss. 17
Funding
National Research Foundation of Singapore
Keywords
natural language processing; neural topic model; ELECTRA; ALBERT; multi-classification;
DOI
10.3390/app14177898
Chinese Library Classification
O6 [Chemistry]
Discipline Code
0703
Abstract
As an extension of the transformer architecture, the BERT model introduced a new paradigm for natural language processing, achieving impressive results in various downstream tasks. However, high-performance BERT-based models, such as ELECTRA, ALBERT, and RoBERTa, suffer from limitations such as poor continual learning capability and an insufficient understanding of domain-specific documents. To address these issues, we propose using an attention mechanism to combine BERT-based models with neural topic models. Unlike traditional probabilistic topic modeling, neural topic modeling employs artificial neural networks to learn topic representations. Furthermore, neural topic models can be integrated with other neural models and trained to identify latent variables in documents, thereby enabling BERT-based models to better comprehend the contexts of specific fields. We conducted experiments on three datasets, the Movie Review Dataset (MRD), 20Newsgroups, and YELP, to evaluate our model's performance. Compared to the vanilla models, the proposed approach improved the accuracy of the ALBERT model by 1-2% in multiclass classification tasks across all three datasets, while the ELECTRA model showed an accuracy improvement of less than 1%.
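The abstract describes attending over transformer token representations with a document-level topic vector produced by a neural topic model. The paper's exact architecture is not given in this record, so the following is only an illustrative numpy sketch of one plausible fusion scheme: a topic distribution is embedded into the hidden space, used as the attention query over token embeddings, and the resulting topic-aware context is concatenated with a pooled token embedding before classification. All array shapes, weight names, and the fusion-by-concatenation step are assumptions for illustration, not the authors' published method.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def topic_attention_fusion(token_embs, topic_vec, W_q, W_k, W_v):
    """Attend from a document-level topic vector over token embeddings.

    token_embs: (T, d) stand-in for BERT-style token outputs
    topic_vec:  (d,)   document topic representation
    Returns a (d,) topic-aware context vector and the (T,) attention weights.
    """
    q = topic_vec @ W_q                      # query from the topic vector, (d,)
    k = token_embs @ W_k                     # keys from tokens, (T, d)
    v = token_embs @ W_v                     # values from tokens, (T, d)
    scores = k @ q / np.sqrt(q.shape[-1])    # scaled dot-product scores, (T,)
    attn = softmax(scores)                   # attention over tokens, sums to 1
    context = attn @ v                       # topic-aware context, (d,)
    return context, attn

rng = np.random.default_rng(0)
T, d, K = 6, 8, 4                            # tokens, hidden dim, number of topics

token_embs = rng.normal(size=(T, d))         # stand-in for encoder token outputs
topic_dist = softmax(rng.normal(size=K))     # stand-in for neural topic model output
topic_embed = rng.normal(size=(K, d))        # learnable topic embedding matrix
topic_vec = topic_dist @ topic_embed         # document-level topic vector, (d,)

W_q, W_k, W_v = (rng.normal(size=(d, d)) for _ in range(3))
context, attn = topic_attention_fusion(token_embs, topic_vec, W_q, W_k, W_v)

# One way to feed a classifier: concatenate a [CLS]-style pooled embedding
# (here simply the first token) with the topic-aware context vector.
fused = np.concatenate([token_embs[0], context])
```

In a trained system, `token_embs` would come from a frozen or fine-tuned ELECTRA/ALBERT encoder and `topic_dist` from the neural topic model's inference network, with `W_q`, `W_k`, `W_v`, and `topic_embed` learned jointly with the classification head.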
Pages: 14
Related Papers
50 total
  • [1] Empirical Study of Tweets Topic Classification Using Transformer-Based Language Models
    Mandal, Ranju
    Chen, Jinyan
    Becken, Susanne
    Stantic, Bela
    INTELLIGENT INFORMATION AND DATABASE SYSTEMS, ACIIDS 2021, 2021, 12672 : 340 - 350
  • [2] Is Transformer-Based Attention Agnostic of the Pretraining Language and Task?
    Martin, R. H. J.
    Visser, R.
    Dunaiski, M.
    SOUTH AFRICAN COMPUTER SCIENCE AND INFORMATION SYSTEMS RESEARCH TRENDS, SAICSIT 2024, 2024, 2159 : 95 - 123
  • [3] Tweets Topic Classification and Sentiment Analysis Based on Transformer-Based Language Models
    Mandal, Ranju
    Chen, Jinyan
    Becken, Susanne
    Stantic, Bela
    VIETNAM JOURNAL OF COMPUTER SCIENCE, 2023, 10 (02) : 117 - 134
  • [4] Automatic assessment of divergent thinking in Chinese language with TransDis: A transformer-based language model approach
    Yang, Tianchen
    Zhang, Qifan
    Sun, Zhaoyang
    Hou, Yubo
    BEHAVIOR RESEARCH METHODS, 2024, 56 (06) : 5798 - 5819
  • [5] Transformer-Based Deep Neural Language Modeling for Construct-Specific Automatic Item Generation
    Hommel, Bjoern E.
    Wollang, Franz-Josef M.
    Kotova, Veronika
    Zacher, Hannes
    Schmukle, Stefan C.
    PSYCHOMETRIKA, 2022, 87 (02) : 749 - 772
  • [6] Transformer-Based Deep Neural Language Modeling for Construct-Specific Automatic Item Generation
    Björn E. Hommel
    Franz-Josef M. Wollang
    Veronika Kotova
    Hannes Zacher
    Stefan C. Schmukle
    Psychometrika, 2022, 87 : 749 - 772
  • [7] On compositional generalization of transformer-based neural machine translation
    Yin, Yongjing
    Fu, Lian
    Li, Yafu
    Zhang, Yue
    INFORMATION FUSION, 2024, 111
  • [8] Integrating structured and unstructured data for predicting emergency severity: an association and predictive study using transformer-based natural language processing models
    Zhang, Xingyu
    Wang, Yanshan
    Jiang, Yun
    Pacella, Charissa B.
    Zhang, Wenbin
    BMC MEDICAL INFORMATICS AND DECISION MAKING, 2024, 24 (01)
  • [9] A Transformer-based Approach for Translating Natural Language to Bash Commands
    Fu, Quchen
    Teng, Zhongwei
    White, Jules
    Schmidt, Douglas C.
    20TH IEEE INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND APPLICATIONS (ICMLA 2021), 2021, : 1245 - 1248
  • [10] Smart Home Notifications in Croatian Language: A Transformer-Based Approach
    Simunec, Magdalena
    Soic, Renato
    2023 17TH INTERNATIONAL CONFERENCE ON TELECOMMUNICATIONS, CONTEL, 2023