共 50 条
- [2] Character-Level Syntax Infusion in Pre-Trained Models for Chinese Semantic Role Labeling International Journal of Machine Learning and Cybernetics, 2021, 12 : 3503 - 3515
- [6] Natural Attack for Pre-trained Models of Code 2022 ACM/IEEE 44TH INTERNATIONAL CONFERENCE ON SOFTWARE ENGINEERING (ICSE 2022), 2022, : 1482 - 1493
- [7] HinPLMs: Pre-trained Language Models for Hindi 2021 INTERNATIONAL CONFERENCE ON ASIAN LANGUAGE PROCESSING (IALP), 2021, : 241 - 246
- [8] Disambiguating Clinical Abbreviations using Pre-trained Word Embeddings HEALTHINF: PROCEEDINGS OF THE 14TH INTERNATIONAL JOINT CONFERENCE ON BIOMEDICAL ENGINEERING SYSTEMS AND TECHNOLOGIES - VOL. 5: HEALTHINF, 2021, : 501 - 508
- [10] Compressing Pre-trained Models of Code into 3 MB PROCEEDINGS OF THE 37TH IEEE/ACM INTERNATIONAL CONFERENCE ON AUTOMATED SOFTWARE ENGINEERING, ASE 2022, 2022,