共 50 条
- [1] Dynamic Knowledge Distillation for Pre-trained Language Models 2021 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2021), 2021, : 379 - 389
- [2] Knowledge Base Grounded Pre-trained Language Models via Distillation 39TH ANNUAL ACM SYMPOSIUM ON APPLIED COMPUTING, SAC 2024, 2024, : 1617 - 1625
- [3] Emotional Paraphrasing Using Pre-trained Language Models 2021 9TH INTERNATIONAL CONFERENCE ON AFFECTIVE COMPUTING AND INTELLIGENT INTERACTION WORKSHOPS AND DEMOS (ACIIW), 2021,
- [4] ReAugKD: Retrieval-Augmented Knowledge Distillation For Pre-trained Language Models 61ST CONFERENCE OF THE THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL 2023, VOL 2, 2023, : 1128 - 1136
- [6] μBERT: Mutation Testing using Pre-Trained Language Models 2022 IEEE 15TH INTERNATIONAL CONFERENCE ON SOFTWARE TESTING, VERIFICATION AND VALIDATION WORKSHOPS (ICSTW 2022), 2022, : 160 - 169
- [7] Devulgarization of Polish Texts Using Pre-trained Language Models COMPUTATIONAL SCIENCE, ICCS 2022, PT II, 2022, : 49 - 55
- [9] Issue Report Classification Using Pre-trained Language Models 2022 IEEE/ACM 1ST INTERNATIONAL WORKSHOP ON NATURAL LANGUAGE-BASED SOFTWARE ENGINEERING (NLBSE 2022), 2022, : 29 - 32
- [10] Automated Assessment of Inferences Using Pre-Trained Language Models APPLIED SCIENCES-BASEL, 2024, 14 (09):