25 entries in total
- [1] Multi-student Collaborative Self-supervised Distillation. Advanced Intelligent Computing Technology and Applications, ICIC 2023, Pt II, 2023, 14087: 199-210
- [2] Privacy-Preserving Student Learning with Differentially Private Data-Free Distillation. 2022 IEEE 24th International Workshop on Multimedia Signal Processing (MMSP), 2022
- [4] Robustness and Diversity Seeking Data-Free Knowledge Distillation. 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2021), 2021: 2740-2744
- [6] Data-free Knowledge Distillation for Reusing Recommendation Models. Proceedings of the 17th ACM Conference on Recommender Systems, RecSys 2023, 2023: 386-395
- [7] Double-Generators Network for Data-Free Knowledge Distillation. Journal of Computer Research and Development (Jisuanji Yanjiu yu Fazhan), 2023, 60 (07): 1615-1627
- [10] Dual Discriminator Adversarial Distillation for Data-Free Model Compression. International Journal of Machine Learning and Cybernetics, 2022, 13: 1213-1230