50 records in total
- [1] Regularizing Brain Age Prediction via Gated Knowledge Distillation. International Conference on Medical Imaging with Deep Learning, vol. 172, 2022, pp. 1430-1443.
- [5] Regularizing CNN via Feature Augmentation. Neural Information Processing (ICONIP 2017), Pt. II, vol. 10635, 2017, pp. 325-332.
- [6] Improving Deep Mutual Learning via Knowledge Distillation. Applied Sciences-Basel, 2022, 12(15).
- [8] Logitwise Distillation Network: Improving Knowledge Distillation via Introducing Sample Confidence. Applied Sciences-Basel, 2025, 15(5).