Distribution Structure Learning Loss (DSLL) Based on Deep Metric Learning for Image Retrieval

Citations: 12
Authors
Fan, Lili [1 ]
Zhao, Hongwei [1 ,2 ]
Zhao, Haoyu [3 ]
Liu, Pingping [1 ,2 ]
Hu, Huangshui [4 ]
Affiliations
[1] Jilin Univ, Coll Comp Sci & Technol, Changchun 130012, Jilin, Peoples R China
[2] Jilin Univ, Key Lab Symbol Computat & Knowledge Engn, Minist Educ, Changchun 130012, Jilin, Peoples R China
[3] Jilin Univ, Editorial Dept Journal Engn & Technol Edit, Changchun 130012, Jilin, Peoples R China
[4] Changchun Univ Technol, Sch Comp Sci & Engn, Changchun 130012, Jilin, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
deep metric learning; entropy weight; fine-tune network; image retrieval; structural preservation; structural ranking consistency; OBJECT RETRIEVAL; FACE;
DOI
10.3390/e21111121
CLC Classification Number
O4 [Physics];
Discipline Classification Code
0702;
Abstract
The massive number of images demands highly efficient image retrieval tools. Deep distance metric learning (DDML) learns image similarity metrics in an end-to-end manner based on a convolutional neural network and has achieved encouraging results. The loss function is crucial in DDML frameworks. However, current methods have a limitation: when learning the similarity of positive and negative examples, they pull positive pairs as close as possible and push negative pairs to equal distances in the embedding space, so the structure of the data distribution may be lost. In this work, we focus on the distribution structure learning loss (DSLL) algorithm, which aims to preserve the geometric information of images. To achieve this, we first propose a metric distance learning scheme for highly matched figures that preserves the similarity structure among them. Second, we introduce an entropy-weight-based structural distribution to set the weights of the representative negative samples. Third, we incorporate these weights into the learning-to-rank process so that the negative samples preserve the consistency of their structural distribution. Finally, we present comprehensive experimental results on three popular landmark-building datasets and demonstrate that our method achieves state-of-the-art performance.
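The abstract describes three ingredients: pulling positives close, assigning entropy-based weights to representative negative samples, and folding those weights into a margin-based ranking term. The paper's exact formulation is not given in this record, so the following is only an illustrative sketch under assumed definitions (a softmax over negative distances as the distribution, its per-sample entropy terms as weights, and a standard hinge ranking penalty); the function name `dsll_sketch` is hypothetical.

```python
import numpy as np

def dsll_sketch(anchor, positive, negatives, margin=1.0):
    """Illustrative sketch (not the paper's exact loss) combining:
      1) a pull term drawing the positive toward the anchor,
      2) entropy-style weights over the anchor-negative distance
         distribution, emphasizing representative negatives,
      3) weighted margin-based ranking penalties for negatives.
    anchor, positive: (d,) embeddings; negatives: (n, d) embeddings.
    """
    d_pos = np.linalg.norm(anchor - positive)
    d_negs = np.linalg.norm(negatives - anchor, axis=1)

    # Treat a softmax over (negated) distances as the structural
    # distribution of negatives; closer negatives get higher probability.
    p = np.exp(-d_negs) / np.sum(np.exp(-d_negs))

    # Per-negative entropy terms -p*log(p), normalized to sum to one,
    # serve as the assumed "entropy weights" of the negatives.
    weights = -p * np.log(p + 1e-12)
    weights = weights / weights.sum()

    # Hinge ranking term: penalize negatives closer than d_pos + margin.
    violations = np.maximum(0.0, margin + d_pos - d_negs)
    return d_pos ** 2 + np.sum(weights * violations)
```

In this sketch, a hard negative (small anchor distance) raises its softmax probability and thus its share of the ranking penalty, which mirrors the abstract's idea of weighting representative negatives rather than pushing all of them to equal distances.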
Pages: 22
Related Papers
50 records
  • [41] Content based image retrieval using deep learning process
    Saritha, R. Rani
    Paul, Varghese
    Kumar, P. Ganesh
    CLUSTER COMPUTING-THE JOURNAL OF NETWORKS SOFTWARE TOOLS AND APPLICATIONS, 2019, 22 (02): : S4187 - S4200
  • [42] Deep Metric Learning: Loss Functions Comparison
    Vasilev, R. L.
    D'yakonov, A. G.
    DOKLADY MATHEMATICS, 2023, 108 (SUPPL 2) : S215 - S225
  • [43] Deep Metric Learning with Tuplet Margin Loss
    Yu, Baosheng
    Tao, Dacheng
    2019 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2019), 2019, : 6499 - 6508
  • [44] Collaborative image retrieval via regularized metric learning
    Si, Luo
    Jin, Rong
    Hoi, Steven C. H.
    Lyu, Michael R.
    MULTIMEDIA SYSTEMS, 2006, 12 (01) : 34 - 44
  • [46] Ranked List Loss for Deep Metric Learning
    Wang, Xinshao
    Hua, Yang
    Kodirov, Elyor
    Hu, Guosheng
    Garnier, Romain
    Robertson, Neil M.
    2019 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2019), 2019, : 5202 - 5211
  • [47] Ranked List Loss for Deep Metric Learning
    Wang, Xinshao
    Hua, Yang
    Kodirov, Elyor
    Robertson, Neil M.
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2022, 44 (09) : 5414 - 5429
  • [48] Deep Metric Learning with Hierarchical Triplet Loss
    Ge, Weifeng
    Huang, Weilin
    Dong, Dengke
    Scott, Matthew R.
    COMPUTER VISION - ECCV 2018, PT VI, 2018, 11210 : 272 - 288
  • [49] Proxy Anchor Loss for Deep Metric Learning
    Kim, Sungyeon
    Kim, Dongwon
    Cho, Minsu
    Kwak, Suha
    2020 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2020, : 3235 - 3244