20 references in total
- [1] Deep Model Compression and Architecture Optimization for Embedded Systems: A Survey [J]. Journal of Signal Processing Systems for Signal, Image and Video Technology, 2021, 93(8): 863-878.
- [2] Distill on the Go: Online Knowledge Distillation in Self-Supervised Learning [C]. 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), 2021: 2672-2681.
- [3] On the Efficacy of Knowledge Distillation [C]. 2019 IEEE/CVF International Conference on Computer Vision (ICCV), 2019: 4793-4801.
- [4] Dubey A. 32nd Conference on Neural Information Processing Systems (NeurIPS), 2018, Vol. 31.
- [5] Teaching Yourself: A Self-Knowledge Distillation Approach to Action Recognition [J]. IEEE Access, 2021, 9: 105711-105723.
- [6] Furlanello T. Born Again Neural Networks [C]. Proceedings of Machine Learning Research (ICML), 2018, Vol. 80.
- [7] Knowledge Distillation: A Survey [J]. International Journal of Computer Vision, 2021, 129(6): 1789-1819.
- [8] Haarnoja T. Proceedings of Machine Learning Research (ICML), 2018, Vol. 80.
- [9] Video Representation Learning by Dense Predictive Coding [C]. 2019 IEEE/CVF International Conference on Computer Vision Workshops (ICCVW), 2019: 1483-1492.
- [10] Hinton G. Distilling the Knowledge in a Neural Network [EB/OL]. arXiv:1503.02531, 2015.