- [21] Improving Quality Estimation of Machine Translation by Using Pre-trained Language Representation. Machine Translation, CCMT 2019, 2019, 1104: 11-22
- [22] Unsupervised Representation Learning from Pre-trained Diffusion Probabilistic Models. Advances in Neural Information Processing Systems 35 (NeurIPS 2022), 2022.
- [23] TOKEN Is a MASK: Few-shot Named Entity Recognition with Pre-trained Language Models. Text, Speech, and Dialogue (TSD 2022), 2022, 13502: 138-150
- [24] CokeBERT: Contextual Knowledge Selection and Embedding towards Enhanced Pre-trained Language Models. AI Open, 2021, 2: 127-134
- [25] Knowledge Base Grounded Pre-trained Language Models via Distillation. 39th Annual ACM Symposium on Applied Computing, SAC 2024, 2024: 1617-1625
- [27] Improving the Reusability of Pre-trained Language Models in Real-world Applications. 2023 IEEE 24th International Conference on Information Reuse and Integration for Data Science (IRI), 2023: 40-45
- [28] Improving Gender Fairness of Pre-Trained Language Models without Catastrophic Forgetting. 61st Conference of the Association for Computational Linguistics, ACL 2023, Vol 2, 2023: 1249-1262
- [29] Rewire-then-Probe: A Contrastive Recipe for Probing Biomedical Knowledge of Pre-trained Language Models. Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (ACL 2022), Vol 1: Long Papers, 2022: 4798-4810
- [30] Fact-Checking the Output of Large Language Models via Token-Level Uncertainty Quantification. Findings of the Association for Computational Linguistics: ACL 2024, 2024: 9367-9385