56 entries in total
- [1] Ahn S., 2019, Variational information distillation for knowledge transfer, P9155, DOI 10.1109/CVPR.2019.00938
- [2] Allen-Zhu Z., 2023, arXiv:2012.09816
- [3] Anil R., 2020, arXiv:1804.03235
- [4] Beyer L., 2022, Knowledge distillation: A good teacher is patient and consistent, 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), P10915
- [5] Bucila C., 2006, P ACM SIGKDD INT C K, P535
- [6] Chen DF, 2020, AAAI CONF ARTIF INTE, V34, P3430
- [7] Chen P., 2021, Distilling Knowledge via Knowledge Review, 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), P5006
- [8] Chen T., 2020, Adv. Neural Inf. Process. Syst., V33, P22243
- [9] Chi Z., 2020, LNCS, V12372, P107
- [10] Chi Z., 2024, 12 INT C LEARN REPR