50 items in total
- [31] Autocorrelation Matrix Knowledge Distillation: A Task-Specific Distillation Method for BERT Models. Applied Sciences-Basel, 2024, 14(20).
- [33] MoEVC: A Mixture of Experts Voice Conversion System With Sparse Gating Mechanism for Online Computation Acceleration. 2021 12th International Symposium on Chinese Spoken Language Processing (ISCSLP), 2021.
- [35] Facilitating Pornographic Text Detection for Open-Domain Dialogue Systems via Knowledge Distillation of Large Language Models. Proceedings of the 2024 27th International Conference on Computer Supported Cooperative Work in Design (CSCWD 2024), 2024: 2313-2318.
- [36] Knowledge Distillation for Recurrent Neural Network Language Modeling with Trust Regularization. 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2019: 7230-7234.
- [38] Weather Recognition Combining Improved ConvNeXt Models with Knowledge Distillation. Guangxue Jingmi Gongcheng/Optics and Precision Engineering, 2023, 31(14): 2123-2134.
- [39] Domain Adaptation of DNN Acoustic Models Using Knowledge Distillation. 2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2017: 5185-5189.
- [40] Model-Agnostic Knowledge Distillation Between Heterogeneous Models. Natural Language Processing and Chinese Computing, Pt I, NLPCC 2024, 2025, 15359: 245-257.