Learning Deep Representations with Probabilistic Knowledge Transfer

Cited by: 329
Authors
Passalis, Nikolaos [1 ]
Tefas, Anastasios [1 ]
Affiliations
[1] Aristotle Univ Thessaloniki, Thessaloniki 54124, Greece
Source
COMPUTER VISION - ECCV 2018, PT XI | 2018 / Vol. 11215
Keywords
Knowledge transfer; Neural network distillation
DOI
10.1007/978-3-030-01252-6_17
CLC classification
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Knowledge Transfer (KT) techniques tackle the problem of transferring the knowledge of a large and complex neural network into a smaller and faster one. However, existing KT methods are tailored to classification tasks and cannot be used efficiently for other representation learning tasks. In this paper we propose a novel probabilistic knowledge transfer method that works by matching the probability distribution of the data in the feature space instead of their actual representations. Apart from outperforming existing KT techniques, the proposed method overcomes several of their limitations, providing new insight into KT as well as enabling novel KT applications, ranging from KT from handcrafted feature extractors to cross-modal KT from the textual modality into the representation extracted from the visual modality of the data.
Pages: 283-299
Page count: 17
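
The abstract describes matching pairwise probability distributions in the feature space rather than the raw feature vectors themselves. The following is a minimal NumPy sketch of that idea, assuming a cosine-similarity kernel and a batch-wise KL-divergence loss; all function and variable names here are illustrative and are not taken from the authors' code.

import numpy as np

def cosine_pairwise_probs(feats, eps=1e-8):
    # Turn a batch of feature vectors into a row-stochastic matrix of
    # conditional probabilities p[j|i], using a cosine-similarity kernel
    # rescaled to [0, 1] (one common kernel choice; the paper frames this
    # as kernel density estimation, so other kernels are possible).
    norms = np.linalg.norm(feats, axis=1, keepdims=True)
    unit = feats / np.maximum(norms, eps)
    sim = (unit @ unit.T + 1.0) / 2.0          # cosine similarity in [0, 1]
    np.fill_diagonal(sim, 0.0)                 # exclude self-similarity
    return sim / np.maximum(sim.sum(axis=1, keepdims=True), eps)

def pkt_loss(teacher_feats, student_feats, eps=1e-8):
    # KL divergence between the teacher's and the student's pairwise
    # conditional distributions, averaged over the batch.
    p = cosine_pairwise_probs(teacher_feats, eps)  # target distribution
    q = cosine_pairwise_probs(student_feats, eps)  # model distribution
    return np.mean(np.sum(p * np.log((p + eps) / (q + eps)), axis=1))

# Toy usage: a 16-sample batch with a 64-d teacher and an 8-d student space.
rng = np.random.default_rng(0)
t = rng.normal(size=(16, 64))
s = rng.normal(size=(16, 8))
print(pkt_loss(t, s))  # scalar loss, minimized w.r.t. the student network

Because only the pairwise conditional probabilities are matched, the teacher and student feature spaces may have different dimensionalities, which is what enables the transfer from handcrafted feature extractors and the cross-modal transfer mentioned in the abstract.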