DisWOT: Student Architecture Search for Distillation WithOut Training

Cited by: 23
Authors
Dong, Peijie [1]
Li, Lujun [2]
Wei, Zimian [1]
Affiliations
[1] Natl Univ Def Technol, Changsha, Peoples R China
[2] Chinese Acad Sci, Beijing 100864, Peoples R China
Source
2023 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR) | 2023
Keywords
DOI
10.1109/CVPR52729.2023.01145
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Knowledge distillation (KD) is an effective training strategy for improving lightweight student models under the guidance of cumbersome teachers. However, the large architectural gap between teacher-student pairs limits the distillation gains. In contrast to previous adaptive distillation methods that reduce the teacher-student gap, we explore a novel training-free framework to search for the best student architecture for a given teacher. Our work first shows empirically that the optimal model under vanilla training is not necessarily the winner in distillation. Second, we find that the similarity of feature semantics and sample relations between randomly initialized teacher-student networks correlates well with final distillation performance. Thus, we efficiently measure similarity matrices conditioned on semantic activation maps and select the optimal student via an evolutionary algorithm, without any training. In this way, our student architecture search for Distillation WithOut Training (DisWOT) significantly improves the performance of the searched model in the distillation stage, with at least a 180x training speedup. Additionally, we extend the similarity metrics in DisWOT as new distillers and KD-based zero-cost proxies. Our experiments on CIFAR, ImageNet and NAS-Bench-201 demonstrate that our technique achieves state-of-the-art results across different search spaces. Our project and code are available at https://lilujunai.github.io/DisWOT-CVPR2023/.
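As a rough illustration of the training-free criterion described in the abstract, the sketch below scores a candidate student by comparing sample-relation (Gram) matrices computed from the outputs of a randomly initialized teacher and student on a single batch. This is a minimal sketch, not the authors' exact method: the specific similarity metrics, feature layers, and evolutionary search of DisWOT are not reproduced, and the function names and the Frobenius-distance choice are illustrative assumptions.

```python
# Minimal sketch of a DisWOT-style training-free ranking proxy (illustrative only).
# Assumes both models, at random initialization, return a feature or logit tensor
# of shape (B, D) or (B, C, H, W) for a batch of images; no training is performed.
import torch
import torch.nn.functional as F


def relation_matrix(features: torch.Tensor) -> torch.Tensor:
    # Flatten to (B, D), L2-normalize rows, and take pairwise cosine similarities,
    # giving a (B, B) sample-relation (Gram) matrix for the batch.
    f = F.normalize(features.flatten(1), dim=1)
    return f @ f.t()


@torch.no_grad()
def diswot_style_score(teacher: torch.nn.Module,
                       student: torch.nn.Module,
                       images: torch.Tensor) -> float:
    # Compare how similarly the two randomly initialized networks relate the
    # samples in the batch; larger (less negative) scores indicate a closer match.
    t_rel = relation_matrix(teacher(images))
    s_rel = relation_matrix(student(images))
    return -torch.norm(t_rel - s_rel, p='fro').item()


# Illustrative usage (names such as `candidates` are hypothetical): rank candidate
# student architectures for a fixed teacher and keep the highest-scoring one, e.g.
# inside the mutation/selection loop of an evolutionary search.
#   batch = torch.randn(32, 3, 32, 32)
#   scores = {name: diswot_style_score(teacher, cand, batch)
#             for name, cand in candidates.items()}
#   best_student = max(scores, key=scores.get)
```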
Pages: 11898-11908
Page count: 11