- [21] Model compression via pruning and knowledge distillation for person re-identification. Journal of Ambient Intelligence and Humanized Computing, 2021, 12: 2149-2161
- [22] Augmenting Knowledge Distillation with Peer-to-Peer Mutual Learning for Model Compression. 2022 IEEE International Symposium on Biomedical Imaging (IEEE ISBI 2022), 2022
- [24] Attention-Fused CNN Model Compression with Knowledge Distillation for Brain Tumor Segmentation. Medical Image Understanding and Analysis (MIUA 2022), 2022, 13413: 328-338
- [25] The Optimization Method of Knowledge Distillation Based on Model Pruning. 2020 Chinese Automation Congress (CAC 2020), 2020: 1386-1390
- [26] Model Compression with Two-stage Multi-teacher Knowledge Distillation for Web Question Answering System. Proceedings of the 13th International Conference on Web Search and Data Mining (WSDM '20), 2020: 690-698
- [29] On-Demand Deep Model Compression for Mobile Devices: A Usage-Driven Model Selection Framework. MobiSys '18: Proceedings of the 16th ACM International Conference on Mobile Systems, Applications, and Services, 2018: 389-400