DisWOT: Student Architecture Search for Distillation WithOut Training

Cited by: 23
Authors
Dong, Peijie [1 ]
Li, Lujun [2 ]
Wei, Zimian [1 ]
Affiliations
[1] Natl Univ Def Technol, Changsha, Peoples R China
[2] Chinese Acad Sci, Beijing 100864, Peoples R China
Source
2023 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR) | 2023
DOI
10.1109/CVPR52729.2023.01145
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Knowledge distillation (KD) is an effective training strategy for improving lightweight student models under the guidance of cumbersome teachers. However, the large architecture difference across teacher-student pairs limits the distillation gains. In contrast to previous adaptive distillation methods that reduce the teacher-student gap, we explore a novel training-free framework to search for the best student architecture for a given teacher. Our work first empirically shows that the optimal model under vanilla training is not the winner in distillation. Secondly, we find that the similarity of feature semantics and sample relations between randomly initialized teacher-student networks correlates well with final distillation performance. Thus, we efficiently measure similarity matrices conditioned on the semantic activation maps to select the optimal student via an evolutionary algorithm, without any training. In this way, our student architecture search for Distillation WithOut Training (DisWOT) significantly improves the model's performance in the distillation stage, with at least a 180x training speedup. Additionally, we extend the similarity metrics in DisWOT as new distillers and KD-based zero-cost proxies. Our experiments on CIFAR, ImageNet, and NAS-Bench-201 demonstrate that our technique achieves state-of-the-art results across different search spaces. Our project and code are available at https://lilujunai.github.io/DisWOT-CVPR2023/.
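The abstract's core idea, scoring a candidate student by how closely its randomly initialized features mirror the teacher's pairwise sample relations, can be illustrated with a minimal sketch. This is not the authors' code: the function name, feature shapes, and the low-rank toy features are illustrative assumptions, and only a sample-relation term (one of DisWOT's similarity metrics, without the semantic-activation conditioning or the evolutionary search) is shown:

```python
import numpy as np

def relation_similarity(feat_t, feat_s):
    """Training-free score: how closely the student's sample-relation (Gram)
    matrix matches the teacher's on the same batch. Higher is better."""
    # Flatten each sample's features to a vector: (batch, -1)
    ft = feat_t.reshape(feat_t.shape[0], -1)
    fs = feat_s.reshape(feat_s.shape[0], -1)
    # L2-normalize rows so each Gram matrix holds pairwise cosine similarities
    ft = ft / (np.linalg.norm(ft, axis=1, keepdims=True) + 1e-8)
    fs = fs / (np.linalg.norm(fs, axis=1, keepdims=True) + 1e-8)
    gt, gs = ft @ ft.T, fs @ fs.T          # (batch, batch) relation matrices
    return -float(np.linalg.norm(gt - gs)) # negative distance between relations

rng = np.random.default_rng(0)
batch, latent = 8, 4
z = rng.standard_normal((batch, latent))           # shared "semantic" factors
teacher = z @ rng.standard_normal((latent, 1024))  # teacher features
student_a = z @ rng.standard_normal((latent, 512)) # student aligned with teacher
student_b = rng.standard_normal((batch, 512))      # unrelated student

score_a = relation_similarity(teacher, student_a)
score_b = relation_similarity(teacher, student_b)
print(score_a > score_b)  # the aligned student scores higher
```

Because both the teacher and `student_a` are linear maps of the same latent factors, their sample-relation matrices nearly coincide even at random initialization, so the proxy ranks `student_a` above the unrelated `student_b` without any training.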
Pages: 11898-11908
Page count: 11