27 items in total
- [1] Fast and accurate image retrieval using knowledge distillation from multiple deep pre-trained networks. Multimedia Tools and Applications, 2023, 82: 33937-33959.
- [3] Knowledge Distillation for Neural Transducers from Large Self-Supervised Pre-trained Models. 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2022: 8527-8531.
- [4] AdaDS: Adaptive data selection for accelerating pre-trained language model knowledge distillation. AI Open, 2023, 4: 56-63.
- [5] Grand: A Fast and Accurate Graph Retrieval Framework via Knowledge Distillation. Proceedings of the 47th International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR 2024), 2024: 1639-1648.
- [6] Knowledge Transfer from Pre-trained Language Models to Cif-based Speech Recognizers via Hierarchical Distillation. Interspeech 2023, 2023: 1364-1368.
- [8] Deep Learning of Pre-Classification for Fast Image Retrieval. 2018 International Conference on Algorithms, Computing and Artificial Intelligence (ACAI 2018), 2018.
- [9] Fast and Accurate Facial Expression Image Classification and Regression Method Based on Knowledge Distillation. Applied Sciences-Basel, 2023, 13 (11).
- [10] One-Step Knowledge Distillation and Fine-Tuning in Using Large Pre-Trained Self-Supervised Learning Models for Speaker Verification. Interspeech 2023, 2023: 5271-5275.