30 records in total
[1] Beyer L, Zhai X, Royer A, et al. Knowledge distillation: A good teacher is patient and consistent [C]// 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR). 2022: 10915-10924.
[2] Chen P, Liu S, Zhao H, et al. Distilling knowledge via knowledge review [C]// 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR). 2021: 5006-5015.
[3] Cho J H, Hariharan B. On the efficacy of knowledge distillation [C]// 2019 IEEE/CVF International Conference on Computer Vision (ICCV). 2019: 4793-4801.
[4] Furlanello T, Lipton Z C, Tschannen M, et al. Born again neural networks [C]// International Conference on Machine Learning (ICML). 2018: 1607.
[6] Guo Z, Yan H, Li H, et al. Class attention transfer based knowledge distillation [C]// 2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR). 2023: 11868-11877.
[7] He K, Zhang X, Ren S, et al. Delving deep into rectifiers: Surpassing human-level performance on ImageNet classification [C]// 2015 IEEE International Conference on Computer Vision (ICCV). 2015: 1026-1034.
[8] He K, Zhang X, Ren S, et al. Deep residual learning for image recognition [C]// 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). 2016: 770-778.
[9] Hinton G, Vinyals O, Dean J. Distilling the knowledge in a neural network [EB/OL]. arXiv:1503.02531, 2015.
[10] Hu J, Shen L, Sun G. Squeeze-and-excitation networks [C]// Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR). 2018: 7132. DOI: 10.1109/CVPR.2018.00745; 10.1109/TPAMI.2019.2913372.