50 records in total
- [21] Using Distillation to Improve Network Performance after Pruning and Quantization. Proceedings of the 2019 2nd International Conference on Machine Learning and Machine Intelligence (MLMI 2019), 2019: 3-6.
- [23] Model Compression Based on Knowledge Distillation and Its Application in HRRP. Proceedings of 2020 IEEE 4th Information Technology, Networking, Electronic and Automation Control Conference (ITNEC 2020), 2020: 1268-1272.
- [24] Vision Transformer Quantization with Multi-Step Knowledge Distillation. Signal Processing, Sensor/Information Fusion, and Target Recognition XXXIII, 2024, 13057.
- [27] Learning Slimming SSD through Pruning and Knowledge Distillation. 2019 Chinese Automation Congress (CAC2019), 2019: 2701-2705.
- [28] Automatic detection of structural defects in tunnel lining via network pruning and knowledge distillation in YOLO. Structural Health Monitoring-An International Journal, 2024.
- [30] Efficient Neural Data Compression for Machine Type Communications via Knowledge Distillation. 2022 IEEE Global Communications Conference (GLOBECOM 2022), 2022: 1169-1174.