50 records in total
- [41] Discover the Effective Strategy for Face Recognition Model Compression by Improved Knowledge Distillation. 2018 25th IEEE International Conference on Image Processing (ICIP), 2018: 2416-2420.
- [42] Augmenting Knowledge Distillation with Peer-to-Peer Mutual Learning for Model Compression. 2022 IEEE International Symposium on Biomedical Imaging (IEEE ISBI 2022), 2022.
- [43] TGNet: A Lightweight Infrared Thermal Image Gesture Recognition Network Based on Knowledge Distillation and Model Pruning. 2024 Cross Strait Radio Science and Wireless Technology Conference (CSRSWTC 2024), 2024: 96-98.
- [46] Attention-Fused CNN Model Compression with Knowledge Distillation for Brain Tumor Segmentation. Medical Image Understanding and Analysis (MIUA 2022), 2022, 13413: 328-338.
- [47] IoT Device Friendly and Communication-Efficient Federated Learning via Joint Model Pruning and Quantization. IEEE Internet of Things Journal, 2022, 9(15): 13638-13650.
- [48] Data-Free Low-Bit Quantization via Dynamic Multi-teacher Knowledge Distillation. Pattern Recognition and Computer Vision (PRCV 2023), Pt VIII, 2024, 14432: 28-41.
- [50] Knowledge Distillation via Information Matching. Neural Information Processing (ICONIP 2023), Pt IV, 2024, 14450: 405-417.