- [41] "Model Selection - Knowledge Distillation Framework for Model Compression," 2021 IEEE Symposium Series on Computational Intelligence (SSCI), 2021.
- [45] "Triplet Knowledge Distillation Networks for Model Compression," 2021 International Joint Conference on Neural Networks (IJCNN), 2021.
- [46] "Analysis of Model Compression Using Knowledge Distillation," IEEE Access, 2022, 10: 85095-85105.
- [48] "Compression of Acoustic Model via Knowledge Distillation and Pruning," 2018 24th International Conference on Pattern Recognition (ICPR), 2018: 2785-2790.
- [49] "Enhancing Global Model Performance in Federated Learning With Non-IID Data Using a Data-Free Generative Diffusion Model," IEEE Access, 2024, 12: 148230-148239.