[1]
Conneau A., Khandelwal K., Goyal N., Chaudhary V., Wenzek G., Guzman F., Grave E., Ott M., Zettlemoyer L., Stoyanov V., Unsupervised cross-lingual representation learning at scale, (2020)
[2]
Ai M., BERT for Russian news clustering, Proceedings of the International Conference “Dialogue 2021”, (2021)
[3]
Radford A., Narasimhan K., Salimans T., Sutskever I., Improving language understanding by generative pre-training, (2018)
[4]
Vaswani A., Shazeer N., Parmar N., Uszkoreit J., Jones L., Gomez A. N., Kaiser L., Polosukhin I., Attention is all you need, Advances in Neural Information Processing Systems, 30, pp. 5998-6008, (2017)
[5]
Radford A., Wu J., Child R., Luan D., Amodei D., Sutskever I., Language models are unsupervised multitask learners, (2019)
[6]
Alvarez-Carmona M. A., Lopez-Monroy A. P., Montes-y-Gomez M., Villaseñor-Pineda L., Meza I., Evaluating topic-based representations for author profiling in social media, (2016)
[7]
Antoun W., Baly F., Hajj H., AraBERT: Transformer-based model for Arabic language understanding, (2020)
[8]
Bernard G., Resources to compute TF-IDF weightings on press articles and tweets, (2022)
[9]
Bassem B., Zrigui M., Gender identification: a comparative study of deep learning architectures, Intelligent Systems Design and Applications: 18th International Conference on Intelligent Systems Design and Applications (ISDA 2018) held in Vellore, vol. 2, pp. 792-800, (2020)
[10]
Butt S., Ashraf N., Sidorov G., Gelbukh A. F., Sexism Identification using BERT and Data Augmentation - EXIST2021, IberLEF@SEPLN, pp. 381-389, (2021)