50 items in total
- [31] Improving the Interpretability of Deep Neural Networks with Knowledge Distillation. 2018 18th IEEE International Conference on Data Mining Workshops (ICDMW), 2018: 905-912.
- [33] Knowledge Distillation via Channel Correlation Structure. Knowledge Science, Engineering and Management, Pt I, 2021, 12815: 357-368.
- [34] Ensembled CTR Prediction via Knowledge Distillation. CIKM '20: Proceedings of the 29th ACM International Conference on Information & Knowledge Management, 2020: 2941-2948.
- [36] Utilizing Video Word Boundaries and Feature-Based Knowledge Distillation Improving Sentence-Level Lip Reading. Pattern Recognition and Computer Vision, PRCV 2023, Pt VI, 2024, 14430: 269-281.
- [38] Contrastive Knowledge Distillation Method Based on Feature Space Embedding. Huanan Ligong Daxue Xuebao / Journal of South China University of Technology (Natural Science), 2023, 51(05): 13-23.
- [40] Knowledge Distillation for Tiny Speech Enhancement with Latent Feature Augmentation. Interspeech 2024, 2024: 652-656.