共 50 条
- [41] Contrastive Knowledge Distillation Method Based on Feature Space Embedding Huanan Ligong Daxue Xuebao/Journal of South China University of Technology (Natural Science), 2023, 51 (05): : 13 - 23
- [44] A Malware Classification Method Based on Knowledge Distillation and Feature Fusion IEEE ACCESS, 2025, 13 : 51268 - 51276
- [45] MULTICHANNEL ASR WITH KNOWLEDGE DISTILLATION AND GENERALIZED CROSS CORRELATION FEATURE 2018 IEEE WORKSHOP ON SPOKEN LANGUAGE TECHNOLOGY (SLT 2018), 2018, : 463 - 469
- [46] Knowledge Distillation for Tiny Speech Enhancement with Latent Feature Augmentation INTERSPEECH 2024, 2024, : 652 - 656
- [49] Improving the Consistency of Semantic Parsing in KBQA Through Knowledge Distillation WEB AND BIG DATA, PT III, APWEB-WAIM 2023, 2024, 14333 : 373 - 388