Text Classification with Transformers and Reformers for Deep Text Data

Cited by: 0
|
Authors
Soleymani, Roghayeh [1 ]
Farret, Jeremie [1 ]
Affiliations
[1] Inmind Technol Inc, Montreal, PQ, Canada
Source
PROCEEDINGS OF THE 2ND INTERNATIONAL CONFERENCE ON ADVANCES IN SIGNAL PROCESSING AND ARTIFICIAL INTELLIGENCE, ASPAI' 2020 | 2020
Keywords
Natural language processing; Text classification; Transformers; Reformers; Trax; Mind in a box;
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In this paper, we present an experimental analysis of Transformers and Reformers for text classification applications in natural language processing. Transformers and Reformers yield state-of-the-art performance by using attention scores to capture the relationships between words in a sentence; these scores can be computed in parallel on GPU clusters. Reformers improve on Transformers by lowering time and memory complexity. We present our evaluation and analysis of applicable architectures for these improved performances. The experiments in this paper are run in Trax on Mind in a Box with three different datasets and under different hyperparameter settings. We observe that Transformers achieve better accuracy and training speed than Reformers for text classification. However, Reformers allow training larger models that cause memory failures for Transformers.
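The paper's own implementation is not reproduced in this record. As a generic illustration of the attention scores the abstract refers to (not the authors' Trax code), the scaled dot-product attention at the core of both Transformers and Reformers can be sketched in NumPy; all names below are illustrative:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V.

    Each row of the weight matrix holds one word's attention
    scores over all other words; all rows are computed with
    dense matrix products, which is why the whole operation
    parallelizes well on GPUs.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # pairwise word-to-word scores
    scores -= scores.max(axis=-1, keepdims=True)  # subtract row max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V, weights

# Toy self-attention example: a "sentence" of 4 tokens, embedding dim 8.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
out, w = scaled_dot_product_attention(X, X, X)
```

The full weight matrix is quadratic in sentence length, which is the memory cost Reformers reduce (via locality-sensitive-hashing attention and reversible layers), matching the abstract's observation that Reformers can train larger models than Transformers on the same hardware.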
Pages: 239 - 243
Page count: 5
Related papers
50 items
  • [1] Data Augmentation with Transformers for Text Classification
    Medardo Tapia-Tellez, Jose
    Jair Escalante, Hugo
    ADVANCES IN COMPUTATIONAL INTELLIGENCE, MICAI 2020, PT II, 2020, 12469 : 247 - 259
  • [2] Data Augmentation Using Transformers and Similarity Measures for Improving Arabic Text Classification
    Refai, Dania
    Abu-Soud, Saleh
    Abdel-Rahman, Mohammad J.
    IEEE ACCESS, 2023, 11 : 132516 - 132531
  • [3] Limitations of Transformers on Clinical Text Classification
    Gao, Shang
    Alawad, Mohammed
    Young, M. Todd
    Gounley, John
    Schaefferkoetter, Noah
    Yoon, Hong Jun
    Wu, Xiao-Cheng
    Durbin, Eric B.
    Doherty, Jennifer
    Stroup, Antoinette
    Coyle, Linda
    Tourassi, Georgia
    IEEE JOURNAL OF BIOMEDICAL AND HEALTH INFORMATICS, 2021, 25 (09) : 3596 - 3607
  • [4] Improving text classification with transformers and layer normalization
    Rodrawangpai, Ben
    Daungjaiboon, Witawat
    MACHINE LEARNING WITH APPLICATIONS, 2022, 10
  • [5] Using of Transformers Models for Text Classification to Mobile Educational Applications
    Garrido, Anabel Pilicita
    Arias, Enrique Barra
    IEEE LATIN AMERICA TRANSACTIONS, 2023, 21 (06) : 730 - 736
  • [6] Data Mining in Clinical Trial Text: Transformers for Classification and Question Answering Tasks
    Schmidt, Lena
    Weeds, Julie
    Higgins, Julian P. T.
    PROCEEDINGS OF THE 13TH INTERNATIONAL JOINT CONFERENCE ON BIOMEDICAL ENGINEERING SYSTEMS AND TECHNOLOGIES, VOL 5: HEALTHINF, 2020, : 83 - 94
  • [7] Deep Reinforcement Learning with Transformers for Text Adventure Games
    Xu, Yunqiu
    Chen, Ling
    Fang, Meng
    Wang, Yang
    Zhang, Chengqi
    2020 IEEE CONFERENCE ON GAMES (IEEE COG 2020), 2020, : 65 - 72
  • [8] Hierarchical Data Augmentation and the Application in Text Classification
    Yu, Shujuan
    Yang, Jie
    Liu, Danlei
    Li, Runqi
    Zhang, Yun
    Zhao, Shengmei
    IEEE ACCESS, 2019, 7 : 185476 - 185485
  • [9] Deep Active Learning for Text Classification
    An, Bang
    Wu, Wenjun
    Han, Huimin
    PROCEEDINGS OF THE 2ND INTERNATIONAL CONFERENCE ON VISION, IMAGE AND SIGNAL PROCESSING (ICVISP 2018), 2018,
  • [10] DGRL: Text Classification with Deep Graph Residual Learning
    Chen, Boyan
    Lu, Guangquan
    Peng, Bo
    Zhang, Wenzhen
    ADVANCED DATA MINING AND APPLICATIONS, 2020, 12447 : 83 - 97