Total: 50 records
- [1] Triplet Knowledge Distillation Networks for Model Compression. 2021 International Joint Conference on Neural Networks (IJCNN), 2021.
- [2] Analysis of Model Compression Using Knowledge Distillation. IEEE Access, 2022, 10: 85095-85105.
- [3] Compression of Acoustic Model via Knowledge Distillation and Pruning. 2018 24th International Conference on Pattern Recognition (ICPR), 2018: 2785-2790.
- [5] Model Compression Based on Knowledge Distillation and Its Application in HRRP. Proceedings of 2020 IEEE 4th Information Technology, Networking, Electronic and Automation Control Conference (ITNEC 2020), 2020: 1268-1272.
- [9] Spirit Distillation: A Model Compression Method with Multi-domain Knowledge Transfer. Knowledge Science, Engineering and Management, Pt. I, 2021, 12815: 553-565.