共 50 条
- [1] Bridging Pre-trained Models and Downstream Tasks for Source Code Understanding 2022 ACM/IEEE 44TH INTERNATIONAL CONFERENCE ON SOFTWARE ENGINEERING (ICSE 2022), 2022, : 287 - 298
- [3] Quantifying Adaptability in Pre-trained Language Models with 500 Tasks NAACL 2022: THE 2022 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES, 2022, : 4696 - 4715
- [4] VLATTACK: Multimodal Adversarial Attacks on Vision-Language Tasks via Pre-trained Models ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
- [5] Pre-trained Language Model Representations for Language Generation 2019 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL HLT 2019), VOL. 1, 2019, : 4052 - 4059
- [6] On the Language Neutrality of Pre-trained Multilingual Representations FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2020, 2020, : 1663 - 1674
- [7] Compression of Generative Pre-trained Language Models via Quantization PROCEEDINGS OF THE 60TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022), VOL 1: (LONG PAPERS), 2022, : 4821 - 4836
- [8] Voting from Nearest Tasks: Meta-Vote Pruning of Pre-trained Models for Downstream Tasks MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES: RESEARCH TRACK, ECML PKDD 2023, PT II, 2023, 14170 : 52 - 68
- [10] Adopting Pre-trained Large Language Models for Regional Language Tasks: A Case Study INTELLIGENT HUMAN COMPUTER INTERACTION, IHCI 2023, PT I, 2024, 14531 : 15 - 25