共 50 条
- [34] Two-Stage Edge-Side Fault Diagnosis Method Based on Double Knowledge Distillation CMC-COMPUTERS MATERIALS & CONTINUA, 2023, 76 (03): : 3623 - 3651
- [35] Personalized Federated Learning Method Based on Collation Game and Knowledge Distillation Dianzi Yu Xinxi Xuebao/Journal of Electronics and Information Technology, 2023, 45 (10): : 3702 - 3709
- [38] Model Compression with Two-stage Multi-teacher Knowledge Distillation for Web Question Answering System PROCEEDINGS OF THE 13TH INTERNATIONAL CONFERENCE ON WEB SEARCH AND DATA MINING (WSDM '20), 2020, : 690 - 698
- [39] Model-based Federated Reinforcement Distillation 2022 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM 2022), 2022, : 1109 - 1114
- [40] FedDK: Improving Cyclic Knowledge Distillation for Personalized Healthcare Federated Learning IEEE ACCESS, 2023, 11 : 72409 - 72417