50 items in total
- [2] Identifying Valid User Stories Using BERT Pre-trained Natural Language Models. INFORMATION SYSTEMS AND TECHNOLOGIES, VOL 3, WORLDCIST 2023, 2024, 801: 167-177
- [3] A Comparative Study on Pre-Trained Models Based on BERT. 2024 6TH INTERNATIONAL CONFERENCE ON NATURAL LANGUAGE PROCESSING, ICNLP 2024, 2024: 326-330
- [4] Emotional Paraphrasing Using Pre-trained Language Models. 2021 9TH INTERNATIONAL CONFERENCE ON AFFECTIVE COMPUTING AND INTELLIGENT INTERACTION WORKSHOPS AND DEMOS (ACIIW), 2021
- [5] BERT-MK: Integrating Graph Contextualized Knowledge into Pre-trained Language Models. FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2020, 2020: 2281-2290
- [9] Devulgarization of Polish Texts Using Pre-trained Language Models. COMPUTATIONAL SCIENCE, ICCS 2022, PT II, 2022: 49-55
- [10] MERGEDISTILL: Merging Pre-trained Language Models using Distillation. FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL-IJCNLP 2021, 2021: 2874-2887