- [23] One-Step Knowledge Distillation and Fine-Tuning in Using Large Pre-Trained Self-Supervised Learning Models for Speaker Verification. INTERSPEECH 2023, 2023: 5271-5275.
- [24] Spirit Distillation: A Model Compression Method with Multi-domain Knowledge Transfer. Knowledge Science, Engineering and Management, Pt I, 2021, 12815: 553-565.
- [26] Discover the Effective Strategy for Face Recognition Model Compression by Improved Knowledge Distillation. 2018 25th IEEE International Conference on Image Processing (ICIP), 2018: 2416-2420.
- [27] PQK: Model Compression via Pruning, Quantization, and Knowledge Distillation. INTERSPEECH 2021, 2021: 4568-4572.
- [28] Legal Judgment Prediction Based on Pre-Training Model and Knowledge Distillation. Kongzhi yu Juece/Control and Decision, 2021, 37(01): 67-76.
- [29] Knowledge Distillation Approach for Efficient Internal Language Model Estimation. INTERSPEECH 2023, 2023: 1339-1343.
- [30] Ensemble Compressed Language Model Based on Knowledge Distillation and Multi-Task Learning. 2022 7th International Conference on Business and Industrial Research (ICBIR 2022), 2022: 72-77.