50 records in total
[41] Cold-started Curriculum Learning for Task-oriented Dialogue Policy[C]//2021 IEEE International Conference on E-Business Engineering (ICEBE 2021), 2021: 100-105.
[44] Poster: AsyncFedKD: Asynchronous Federated Learning with Knowledge Distillation[C]//2023 IEEE/ACM Conference on Connected Health: Applications, Systems and Engineering Technologies (CHASE), 2023: 207-208.
[46] Improving Deep Mutual Learning via Knowledge Distillation[J]. Applied Sciences-Basel, 2022, 12(15).
[48] Learning Slimming SSD through Pruning and Knowledge Distillation[C]//2019 Chinese Automation Congress (CAC2019), 2019: 2701-2705.
[49] Representation Learning and Knowledge Distillation for Lightweight Domain Adaptation[C]//2024 IEEE Conference on Artificial Intelligence (CAI 2024), 2024: 1202-1207.
[50] VideoAdviser: Video Knowledge Distillation for Multimodal Transfer Learning[J]. IEEE Access, 2023, 11: 51229-51240.