共 50 条
- [23] Continual Learning for Neural Machine Translation 2021 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL-HLT 2021), 2021, : 3964 - 3974
- [24] Reinforced Multi-teacher Knowledge Distillation for Unsupervised Sentence Representation ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING-ICANN 2024, PT VII, 2024, 15022 : 320 - 332
- [26] MTKD: Multi-Teacher Knowledge Distillation for Image Super-Resolution COMPUTER VISION - ECCV 2024, PT XXXIX, 2025, 15097 : 364 - 382
- [29] mKDNAD: A network flow anomaly detection method based on multi-teacher knowledge distillation 2022 16TH IEEE INTERNATIONAL CONFERENCE ON SIGNAL PROCESSING (ICSP2022), VOL 1, 2022, : 314 - 319
- [30] Continual Learning with Semi-supervised Contrastive Distillation for Incremental Neural Machine Translation PROCEEDINGS OF THE 62ND ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, VOL 1: LONG PAPERS, 2024, : 10914 - 10928