50 records in total
- [1] CONVFIT: Conversational Fine-Tuning of Pretrained Language Models. 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021), 2021: 1151-1168
- [2] Recall and Learn: Fine-tuning Deep Pretrained Language Models with Less Forgetting. Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020: 7870-7881
- [4] Fine-Tuning Pretrained Language Models to Enhance Dialogue Summarization in Customer Service Centers. Proceedings of the 4th ACM International Conference on AI in Finance (ICAIF 2023), 2023: 365-373
- [5] Noise-Robust Fine-Tuning of Pretrained Language Models via External Guidance. Findings of the Association for Computational Linguistics (EMNLP 2023), 2023: 12528-12540
- [6] Equi-Tuning: Group Equivariant Fine-Tuning of Pretrained Models. Thirty-Seventh AAAI Conference on Artificial Intelligence, Vol. 37, No. 6, 2023: 6788-6796
- [7] Prompting or Fine-tuning? A Comparative Study of Large Language Models for Taxonomy Construction. 2023 ACM/IEEE International Conference on Model Driven Engineering Languages and Systems Companion (MODELS-C), 2023: 588-596
- [8] An Empirical Evaluation of the Zero-Shot, Few-Shot, and Traditional Fine-Tuning Based Pretrained Language Models for Sentiment Analysis in Software Engineering. IEEE Access, 2024, 12: 109714-109734