共 50 条
- [3] SCL-IKD: intermediate knowledge distillation via supervised contrastive representation learning Applied Intelligence, 2023, 53 : 28520 - 28541
- [5] Self-Supervised Contrastive Learning for Camera-to-Radar Knowledge Distillation 2024 20TH INTERNATIONAL CONFERENCE ON DISTRIBUTED COMPUTING IN SMART SYSTEMS AND THE INTERNET OF THINGS, DCOSS-IOT 2024, 2024, : 154 - 161
- [8] SKILL: SIMILARITY-AWARE KNOWLEDGE DISTILLATION FOR SPEECH SELF-SUPERVISED LEARNING 2024 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING WORKSHOPS, ICASSPW 2024, 2024, : 675 - 679
- [9] A Semi-Supervised Federated Learning Scheme via Knowledge Distillation for Intrusion Detection IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS (ICC 2022), 2022, : 2688 - 2693