BERT for Sentiment Analysis: Pre-trained and Fine-Tuned Alternatives

Cited by: 4
Authors
Souza, Frederico Dias [1 ]
de Oliveira e Souza Filho, Joao Baptista [1 ]
Affiliations
[1] Univ Fed Rio de Janeiro, Elect Engn Program, Rio De Janeiro, Brazil
Source
COMPUTATIONAL PROCESSING OF THE PORTUGUESE LANGUAGE, PROPOR 2022 | 2022 / Vol. 13208
Keywords
Sentiment analysis; Natural language processing; Machine learning; Transfer learning; Transformers;
DOI
10.1007/978-3-030-98305-5_20
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
BERT has revolutionized the NLP field by enabling transfer learning with large language models that can capture complex textual patterns, reaching state-of-the-art performance on a large number of NLP applications. For text classification tasks, BERT has already been extensively explored. However, aspects such as how best to handle the different embeddings provided by the BERT output layer, and the use of language-specific rather than multilingual models, are not well studied in the literature, especially for Brazilian Portuguese. The purpose of this article is to conduct an extensive experimental study of different strategies for aggregating the features produced in the BERT output layer, with a focus on the sentiment analysis task. The experiments cover BERT models trained on Brazilian Portuguese corpora as well as the multilingual version, contemplating multiple aggregation strategies and open-source datasets with predefined training, validation, and test partitions to facilitate reproducibility of the results. BERT achieved the highest ROC-AUC values in the majority of cases compared to TF-IDF. Nonetheless, TF-IDF represents a good trade-off between predictive performance and computational cost.
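The aggregation strategies the abstract refers to operate on the token embeddings emitted by the BERT output layer. As a minimal illustrative sketch (not the authors' code), two common choices are taking the [CLS] token embedding and mask-aware mean pooling; here NumPy arrays stand in for a real BERT hidden-state tensor of shape (batch, sequence, hidden):

```python
import numpy as np

def cls_pooling(hidden_states: np.ndarray) -> np.ndarray:
    """Use the first ([CLS]) token's embedding as the sentence vector."""
    return hidden_states[:, 0, :]

def mean_pooling(hidden_states: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    """Average token embeddings, ignoring padding positions via the mask."""
    mask = attention_mask[:, :, None].astype(hidden_states.dtype)  # (batch, seq, 1)
    summed = (hidden_states * mask).sum(axis=1)                    # sum of real tokens
    counts = mask.sum(axis=1)                                      # number of real tokens
    return summed / counts

# Toy stand-in for a BERT output layer: batch=2, seq=4, hidden=3
rng = np.random.default_rng(0)
hidden = rng.normal(size=(2, 4, 3))
mask = np.array([[1, 1, 1, 0],   # first sentence has 3 real tokens
                 [1, 1, 0, 0]])  # second sentence has 2 real tokens

cls_vec = cls_pooling(hidden)          # shape (2, 3)
mean_vec = mean_pooling(hidden, mask)  # shape (2, 3)
```

With a real model (e.g. Hugging Face Transformers), `hidden` would be the `last_hidden_state` tensor, and either pooled vector would then feed a downstream sentiment classifier.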
Pages: 209-218
Page count: 10