BIER - Boosting Independent Embeddings Robustly

Cited by: 94
Authors
Opitz, Michael [1 ]
Waltner, Georg [1 ]
Possegger, Horst [1 ]
Bischof, Horst [1 ]
Affiliations
[1] Graz Univ Technol, Inst Comp Graph & Vis, Graz, Austria
Source
2017 IEEE INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV) | 2017
DOI
10.1109/ICCV.2017.555
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Learning similarity functions between image pairs with deep neural networks yields highly correlated activations of large embeddings. In this work, we show how to improve the robustness of embeddings by exploiting independence in ensembles. We divide the last embedding layer of a deep network into an embedding ensemble and formulate training this ensemble as an online gradient boosting problem. Each learner receives training samples reweighted by the previous learners. This exploits large embedding sizes more effectively by significantly reducing the correlation within the embedding, and consequently increases its retrieval accuracy. Our method does not introduce any additional parameters and works with any differentiable loss function. We evaluate our metric learning method on image retrieval tasks and show that it improves over state-of-the-art methods on the CUB-200-2011, Cars-196, Stanford Online Products, In-Shop Clothes Retrieval and VehicleID datasets by a significant margin.
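The abstract describes the method at a high level: the last embedding layer is split into a group of sub-embeddings (learners) and trained as an online gradient boosting ensemble, with each learner receiving pairs reweighted by the learners before it. The snippet below is a minimal illustrative sketch of that idea in PyTorch; the group sizes, the logistic (binomial-deviance style) pairwise loss, the beta parameter, and the sigmoid-based reweighting rule are assumptions made for illustration and are not taken verbatim from the paper.

import torch
import torch.nn.functional as F

def boosted_ensemble_loss(z, labels, group_sizes, beta=1.0):
    """Treat slices of the embedding as an ensemble of learners and
    reweight pairs for each successive learner (boosting-style sketch).

    z:           (n, d) batch of embeddings, with sum(group_sizes) == d
    labels:      (n,)   integer class labels
    group_sizes: sizes of the sub-embeddings, e.g. [96, 160, 256] for d = 512
                 (example sizes, not prescribed by the paper text above)
    """
    same = (labels[:, None] == labels[None, :]).float()
    y = 2.0 * same - 1.0                            # +1 for positive pairs, -1 for negatives
    n = z.size(0)
    mask = 1.0 - torch.eye(n, device=z.device)      # ignore self-pairs

    weights = torch.ones_like(y)                    # first learner sees uniform pair weights
    total = 0.0
    for zk in torch.split(z, group_sizes, dim=1):   # one sub-embedding per learner
        zk = F.normalize(zk, dim=1)
        s = zk @ zk.t()                             # cosine similarities within the batch
        loss_k = F.softplus(-beta * y * s)          # logistic pairwise loss (assumed)
        total = total + (weights * mask * loss_k).sum() / mask.sum()
        with torch.no_grad():
            # Reweight pairs for the next learner: pairs the ensemble so far
            # still gets wrong receive relatively larger weights.
            weights = weights * torch.sigmoid(-beta * y * s)
            weights = weights / weights.mean().clamp_min(1e-8)
    return total

At retrieval time the sub-embeddings would simply be concatenated (and, for instance, L2-normalized) into a single descriptor, which matches the abstract's point that the ensemble adds no extra parameters.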
Pages: 5199-5208
Page count: 10