50 entries in total
- [1] Online Knowledge Distillation for Multi-task Learning. 2023 IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), 2023, pp. 2358–2367
- [2] Multi-Task Learning with Knowledge Distillation for Dense Prediction. 2023 IEEE/CVF International Conference on Computer Vision (ICCV 2023), 2023, pp. 21493–21502
- [3] Application of Knowledge Distillation to Multi-Task Speech Representation Learning. Interspeech 2023, 2023, pp. 2813–2817
- [4] On Partial Multi-Task Learning. ECAI 2020: 24th European Conference on Artificial Intelligence, 2020, 325, pp. 1174–1181
- [5] Multi-Task Feature Learning for Knowledge Graph Enhanced Recommendation. The Web Conference 2019: Proceedings of the World Wide Web Conference (WWW 2019), 2019, pp. 2000–2010
- [6] Tomato leaf disease recognition based on multi-task distillation learning. Frontiers in Plant Science, 2024, 14
- [7] Multi-Task Distillation: Towards Mitigating the Negative Transfer in Multi-Task Learning. 2021 IEEE International Conference on Image Processing (ICIP), 2021, pp. 389–393
- [8] Cross-Task Knowledge Distillation in Multi-Task Recommendation. Thirty-Sixth AAAI Conference on Artificial Intelligence / Thirty-Fourth Conference on Innovative Applications of Artificial Intelligence / Twelfth Symposium on Educational Advances in Artificial Intelligence, 2022, pp. 4318–4326