50 items total
- [21] Discover the Effective Strategy for Face Recognition Model Compression by Improved Knowledge Distillation. 2018 25th IEEE International Conference on Image Processing (ICIP), 2018: 2416-2420
- [24] Multi-Domain Lifelong Visual Question Answering via Self-Critical Distillation. Proceedings of the 31st ACM International Conference on Multimedia (MM 2023), 2023: 4747-4758
- [25] PQK: Model Compression via Pruning, Quantization, and Knowledge Distillation. Interspeech 2021, 2021: 4568-4572
- [28] Multi-Domain Transfer Learning for Early Diagnosis of Alzheimer's Disease. Neuroinformatics, 2017, 15: 115-132