50 entries in total
- [21] Improving Neural Topic Models with Wasserstein Knowledge Distillation. Advances in Information Retrieval (ECIR 2023), Part II, 2023, 13981: 321-330.
- [22] Compact Models for Periocular Verification Through Knowledge Distillation. 2020 International Conference of the Biometrics Special Interest Group (BIOSIG), 2020, P-306.
- [23] An Investigation of a Knowledge Distillation Method for CTC Acoustic Models. 2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2018: 5809-5813.
- [24] Efficient Knowledge Distillation for RNN-Transducer Models. 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2021), 2021: 5639-5643.
- [25] New Estimation and Feature Selection Methods in Mixture-of-Experts Models. Canadian Journal of Statistics / Revue Canadienne de Statistique, 2010, 38(4): 519-539.
- [26] Mixture of Experts for Intelligent Networks: A Large Language Model-enabled Approach. 20th International Wireless Communications & Mobile Computing Conference (IWCMC 2024), 2024: 531-536.
- [29] SpeechMoE: Scaling to Large Acoustic Models with Dynamic Routing Mixture of Experts. Interspeech 2021, 2021: 2077-2081.