50 results in total
- [1] ERICA: Improving Entity and Relation Understanding for Pre-trained Language Models via Contrastive Learning. 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (ACL-IJCNLP 2021), Vol. 1, 2021: 3350-3363
- [2] Explanation Graph Generation via Pre-trained Language Models: An Empirical Study with Contrastive Learning. Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (ACL 2022), Vol. 1: Long Papers, 2022: 1190-1208
- [3] Injecting Wiktionary to Improve Token-level Contextual Representations Using Contrastive Learning. Proceedings of the 18th Conference of the European Chapter of the Association for Computational Linguistics, Vol. 2: Short Papers, 2024: 34-41
- [4] ContraBERT: Enhancing Code Pre-trained Models via Contrastive Learning. 2023 IEEE/ACM 45th International Conference on Software Engineering (ICSE), 2023: 2476-2487
- [5] Slot Induction via Pre-trained Language Model Probing and Multi-level Contrastive Learning. 24th Meeting of the Special Interest Group on Discourse and Dialogue (SIGDIAL 2023), 2023: 470-481
- [6] A Radical-Based Token Representation Method for Enhancing Chinese Pre-Trained Language Models. Electronics, 2025, 14(05)
- [8] Federated Learning from Pre-Trained Models: A Contrastive Learning Approach. Advances in Neural Information Processing Systems 35 (NeurIPS 2022), 2022
- [9] MIL-Decoding: Detoxifying Language Models at Token-Level via Multiple Instance Learning. Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (ACL 2023), Vol. 1, 2023: 190-202