Total: 28 records
[1] Variational Information Distillation for Knowledge Transfer [J]. 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR 2019), 2019: 9155-9163.
[2] Chen D F. Proceedings of the AAAI Conference on Artificial Intelligence, 2021, 35: 7028.
[3] Chen D F. Proceedings of the AAAI Conference on Artificial Intelligence, 2020, 34: 3430.
[4] EMQ: Evolving Training-free Proxies for Automated Mixed Precision Quantization [J]. 2023 IEEE/CVF International Conference on Computer Vision (ICCV 2023), 2023: 17030-17040.
[5] Dong P J. arXiv preprint arXiv:2402.02105, 2024. DOI: 10.48550/arXiv.2402.02105.
[6] DisWOT: Student Architecture Search for Distillation WithOut Training [J]. 2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2023: 11898-11908.
[7] Dong P J. arXiv preprint arXiv:2206.13329, 2022.
[8] Du S. Advances in Neural Information Processing Systems, 2020, 33: 12345.
[9] Efficient Knowledge Distillation from an Ensemble of Teachers [J]. 18th Annual Conference of the International Speech Communication Association (INTERSPEECH 2017), 2017: 3697-3701.
[10] Hinton G. arXiv preprint arXiv:1503.02531, 2015.