33 entries in total
- [1] Extract then Distill: Efficient and Effective Task-Agnostic BERT Distillation. Artificial Neural Networks and Machine Learning - ICANN 2021, Pt III, 2021, 12893: 570-581
- [3] Harnessing the Power of Prompt Experts: Efficient Knowledge Distillation for Enhanced Language Understanding. Machine Learning and Knowledge Discovery in Databases - Research Track and Demo Track, Pt VIII, ECML PKDD 2024, 2024, 14948: 218-234
- [4] E2VPT: An Effective and Efficient Approach for Visual Prompt Tuning. 2023 IEEE/CVF International Conference on Computer Vision (ICCV 2023), 2023: 17445-17456
- [5] SMoP: Towards Efficient and Effective Prompt Tuning with Sparse Mixture-of-Prompts. 2023 Conference on Empirical Methods in Natural Language Processing (EMNLP 2023), 2023: 14306-14316
- [7] Effective and efficient conditional contrast for data-free knowledge distillation with low memory. Journal of Supercomputing, 2025, 81(4)
- [9] An Efficient Method for Model Pruning Using Knowledge Distillation with Few Samples. 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2022: 2515-2519
- [10] NAYER: Noisy Layer Data Generation for Efficient and Effective Data-free Knowledge Distillation. 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2024: 23860-23869