Search for Better Students to Learn Distilled Knowledge

Cited by: 3
Authors
Gu, Jindong [1,2]
Tresp, Volker [1,2]
Affiliations
[1] Univ Munich, Munich, Germany
[2] Corp Technol, Siemens AG, Munich, Germany
Source
ECAI 2020: 24TH EUROPEAN CONFERENCE ON ARTIFICIAL INTELLIGENCE | 2020 / Vol. 325
DOI
10.3233/FAIA200214
CLC number
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Knowledge Distillation, as a model compression technique, has received great attention. The knowledge of a well-performing teacher is distilled to a student with a small architecture. The architecture of the small student is often chosen to be similar to the teacher's, with fewer layers, fewer channels, or both. However, even with the same number of FLOPs or parameters, students with different architectures can achieve different generalization abilities. Configuring a student architecture requires intensive network architecture engineering. In this work, instead of designing a good student architecture manually, we propose to search for the optimal student automatically. Based on L1-norm optimization, a subgraph of the teacher's network topology graph is selected as the student, with the goal of minimizing the KL-divergence between the student's and the teacher's outputs. We verify the proposal on the CIFAR10 and CIFAR100 datasets. The empirical experiments show that the learned student architecture achieves better performance than manually specified ones. We also visualize and interpret the architecture of the found student.
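The sketch below illustrates the kind of objective the abstract describes: a KL-divergence distillation term between teacher and student outputs combined with an L1 penalty on per-channel gate parameters that prune the teacher topology down to a student subgraph. It is a minimal PyTorch illustration under stated assumptions, not the authors' released code; the names `gates` and `l1_weight` and the use of temperature scaling are assumptions made for the example.

```python
# Minimal sketch (not the authors' code) of the objective described in the abstract:
# KL-divergence between teacher and student outputs plus an L1 penalty on gate
# parameters that select a student subgraph from the teacher's topology.
import torch
import torch.nn.functional as F

def search_loss(student_logits, teacher_logits, gates, l1_weight=1e-4, temperature=4.0):
    # Temperature-softened distributions (temperature scaling is a common KD
    # choice, assumed here rather than taken from the paper).
    log_p_student = F.log_softmax(student_logits / temperature, dim=1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=1)

    # F.kl_div(log_p_student, p_teacher) computes KL(teacher || student) over the
    # softened outputs; the temperature**2 factor keeps gradient magnitudes
    # comparable across temperatures.
    kl = F.kl_div(log_p_student, p_teacher, reduction="batchmean") * temperature ** 2

    # The L1-norm on the gates pushes most of them toward zero; the channels whose
    # gates survive define the selected student subgraph.
    l1 = l1_weight * gates.abs().sum()
    return kl + l1

# Usage sketch: in a real setup the gates would multiply channel outputs inside
# the teacher network while the search runs.
gates = torch.nn.Parameter(torch.ones(64))
student_logits, teacher_logits = torch.randn(8, 10), torch.randn(8, 10)
loss = search_loss(student_logits, teacher_logits, gates)
loss.backward()
```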
Pages: 1159-1165
Number of pages: 7