[1] Acheampong FA, Nunoo-Mensah H, Chen W., Transformer models for text-based emotion detection: a review of BERT-based approaches, Artificial Intelligence Review, 54, pp. 5789-5829, (2021)
[2] Al-Yahya M, Al-Khalifa H, Al-Baity H, Alsaeed D, Essam A., Arabic fake news detection: comparative study of neural networks and transformer-based approaches, Complexity, 2021, pp. 1-10, (2021)
[3] Ayoub J, Yang XJ, Zhou F., Combat COVID-19 infodemic using explainable natural language processing models, Information Processing and Management, 58, 4, (2021)
[4] Bagal V, Aggarwal R, Vinod PK, Priyakumar UD., MolGPT: molecular generation using a transformer-decoder model, Journal of Chemical Information and Modeling, 62, 9, pp. 2064-2076, (2022)
[5] Bakker C, Theis-Mahon N, Brown SJ., Evaluating the accuracy of scite, a smart citation index, Hypothesis: Research Journal for Health Information Professionals, 35, 2, (2023)
[6] Balagopalan A, Eyre B, Rudzicz F, Novikova J., To BERT or not to BERT: comparing speech and language-based approaches for Alzheimer's disease detection, Proceedings of the Annual Conference of the International Speech Communication Association, INTERSPEECH, pp. 2167-2171, (2020)
[7] Chang W-C, Yu H-F, Zhong K, Yang Y, Dhillon IS., Taming pretrained transformers for extreme multi-label text classification, KDD'20: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 3163-3171, (2020)
[8] Colon-Ruiz C, Segura-Bedmar I., Comparing deep learning architectures for sentiment analysis on drug reviews, Journal of Biomedical Informatics, 110, (2020)
[9] Devlin J, Chang M-W, Lee K, Toutanova K., BERT: pre-training of deep bidirectional transformers for language understanding, arXiv preprint arXiv:1810.04805, (2018)
[10] Dhar S, Shamir L., Evaluation of the benchmark datasets for testing the efficacy of deep convolutional neural networks, Visual Informatics, 5, 3, pp. 92-101, (2021)