50 entries in total
- [1] In-Context Analogical Reasoning with Pre-Trained Language Models. Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (ACL 2023), Vol. 1, 2023, pp. 1953–1969.
- [2] Making Pre-trained Language Models Better Few-shot Learners. Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (ACL-IJCNLP 2021), Vol. 1, 2021, pp. 3816–3830.
- [3] Investigating Prompt Learning for Chinese Few-Shot Text Classification with Pre-Trained Language Models. Applied Sciences-Basel, 2022, 12(21).
- [4] Few-Shot NLG with Pre-Trained Language Model. Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics (ACL 2020), 2020, pp. 183–190.
- [5] Pathologies of Pre-trained Language Models in Few-shot Fine-tuning. Proceedings of the Third Workshop on Insights from Negative Results in NLP (Insights 2022), 2022, pp. 144–153.
- [6] Extracting Business Process Entities and Relations from Text Using Pre-trained Language Models and In-Context Learning. Enterprise Design, Operations, and Computing (EDOC 2022), Vol. 13585, 2022, pp. 182–199.
- [7] TOKEN Is a MASK: Few-shot Named Entity Recognition with Pre-trained Language Models. Text, Speech, and Dialogue (TSD 2022), Vol. 13502, 2022, pp. 138–150.
- [8] Defending Pre-trained Language Models as Few-shot Learners against Backdoor Attacks. Advances in Neural Information Processing Systems 36 (NeurIPS 2023), 2023.
- [9] Context Analysis for Pre-trained Masked Language Models. Findings of the Association for Computational Linguistics: EMNLP 2020, 2020, pp. 3789–3804.