50 entries total
- [31] Named Entity Recognition Method Based on Multi-Teacher Collaborative Cyclical Knowledge Distillation. Proceedings of the 2024 27th International Conference on Computer Supported Cooperative Work in Design (CSCWD 2024), 2024: 230-235.
- [32] MTKDSR: Multi-Teacher Knowledge Distillation for Super Resolution Image Reconstruction. 2022 26th International Conference on Pattern Recognition (ICPR), 2022: 352-358.
- [33] Learning Lightweight Object Detectors via Multi-Teacher Progressive Distillation. International Conference on Machine Learning, Vol. 202, 2023.
- [34] MulDE: Multi-teacher Knowledge Distillation for Low-dimensional Knowledge Graph Embeddings. Proceedings of the World Wide Web Conference 2021 (WWW 2021), 2021: 1716-1726.
- [37] Dual Knowledge Distillation for Neural Machine Translation. Computer Speech and Language, 2024, 84.
- [38] Continual Learning Based on Knowledge Distillation and Representation Learning. Artificial Neural Networks and Machine Learning (ICANN 2022), Pt. IV, 2022, 13532: 27-38.
- [39] Learning Semantic Textual Similarity via Multi-Teacher Knowledge Distillation: A Multiple Data Augmentation Method. 2024 9th International Conference on Computer and Communication Systems (ICCCS 2024), 2024: 1197-1203.
- [40] CIMTD: Class Incremental Multi-Teacher Knowledge Distillation for Fractal Object Detection. Pattern Recognition and Computer Vision (PRCV 2024), Pt. XII, 2025, 15042: 51-65.