Deep Metric Learning with BIER: Boosting Independent Embeddings Robustly

Cited by: 92
Authors
Opitz, Michael [1 ]
Waltner, Georg [1 ]
Possegger, Horst [1 ]
Bischof, Horst [1 ]
Affiliations
[1] Graz Univ Technol, Inst Comp Graph & Vis, A-8010 Graz, Austria
Keywords
Measurement; Training; Boosting; Correlation; Feature extraction; Robustness; Task analysis; Metric learning; deep learning; convolutional neural network
DOI
10.1109/TPAMI.2018.2848925
CLC Number (Chinese Library Classification)
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Learning similarity functions between image pairs with deep neural networks yields highly correlated activations of embeddings. In this work, we show how to improve the robustness of such embeddings by exploiting the independence within ensembles. To this end, we divide the last embedding layer of a deep network into an embedding ensemble and formulate the task of training this ensemble as an online gradient boosting problem. Each learner receives a reweighted training sample from the previous learners. Further, we propose two loss functions which increase the diversity in our ensemble. These loss functions can be applied either for weight initialization or during training. Together, our contributions leverage large embedding sizes more effectively by significantly reducing the correlation of the embedding and consequently increasing its retrieval accuracy. Our method works with any differentiable loss function and does not introduce any additional parameters at test time. We evaluate our metric learning method on image retrieval tasks and show that it improves over state-of-the-art methods on the CUB-200-2011, Cars-196, Stanford Online Products, In-Shop Clothes Retrieval and VehicleID datasets. Our findings therefore suggest that by dividing the end of a deep network into several smaller, diverse networks, we can significantly reduce overfitting.
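The abstract's core idea can be illustrated with a short sketch. The following is a minimal, simplified PyTorch illustration (not the authors' reference implementation): the backbone's final embedding is split into several sub-embeddings, and the pairwise loss of each learner is weighted by how poorly the preceding learners already handle each pair, in the spirit of online gradient boosting. The class name `BoostedEmbedding`, the group sizes, and the sigmoid-based reweighting are illustrative assumptions; the paper derives the exact reweighting from its boosting formulation.

```python
# Hedged sketch of the BIER idea: an embedding ensemble trained as an
# online gradient-boosted set of learners on reweighted image pairs.
import torch
import torch.nn as nn
import torch.nn.functional as F


class BoostedEmbedding(nn.Module):
    """Split one backbone feature vector into K sub-embeddings (the ensemble)."""

    def __init__(self, in_dim=512, group_dims=(96, 160, 256)):
        super().__init__()
        self.heads = nn.ModuleList(nn.Linear(in_dim, d) for d in group_dims)

    def forward(self, features):
        # Each head yields one L2-normalized sub-embedding.
        return [F.normalize(h(features), dim=1) for h in self.heads]


def boosted_pair_loss(sub_embeddings, pair_labels, margin=0.5):
    """Simplified boosting-style loss over image pairs.

    pair_labels: +1 for matching pairs, -1 for non-matching pairs.
    Learner k sees per-pair weights that grow for pairs the ensemble of
    learners 1..k-1 still gets wrong (an assumed, simplified reweighting).
    """
    n = pair_labels.shape[0]
    weights = torch.ones(n, device=pair_labels.device)      # first learner: uniform
    ensemble_sim = torch.zeros(n, device=pair_labels.device)
    total_loss = 0.0

    for emb in sub_embeddings:
        a, b = emb[0::2], emb[1::2]          # consecutive rows form one pair
        sim = (a * b).sum(dim=1)             # cosine similarity in [-1, 1]
        # Contrastive-style loss for this learner, weighted by the boosting weights.
        loss_pos = (pair_labels > 0).float() * (1.0 - sim)
        loss_neg = (pair_labels < 0).float() * F.relu(sim - margin)
        total_loss = total_loss + (weights * (loss_pos + loss_neg)).mean()

        # Update the running ensemble prediction and reweight the pairs:
        # hard pairs receive larger weights for the next learner.
        ensemble_sim = ensemble_sim + sim.detach()
        weights = torch.sigmoid(-pair_labels * ensemble_sim)

    return total_loss
```

In use, `BoostedEmbedding` would sit on top of a CNN backbone, and at test time the sub-embeddings are simply concatenated, which is why the ensemble adds no extra parameters beyond the original embedding layer. The paper's auxiliary diversity losses, which decorrelate the learners at initialization or during training, are omitted from this sketch.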
Pages: 276-290
Page count: 15