Self-Supervised Synthesis Ranking for Deep Metric Learning

Cited by: 16
Authors
Fu, Zheren [1 ]
Mao, Zhendong [2 ]
Yan, Chenggang [3 ]
Liu, An-An [4 ]
Xie, Hongtao [2 ]
Zhang, Yongdong [2 ,5 ]
Affiliations
[1] Univ Sci & Technol China, Sch Cyberspace Sci & Technol, Hefei 230027, Peoples R China
[2] Univ Sci & Technol China, Sch Informat Sci & Technol, Hefei 230022, Peoples R China
[3] Hangzhou Dianzi Univ, Sch Automat, Hangzhou 310018, Peoples R China
[4] Tianjin Univ, Sch Elect & Informat Engn, Tianjin 300072, Peoples R China
[5] Inst Artificial Intelligence, Hefei Comprehens Natl Sci Ctr, Hefei 230022, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Measurement; Semantics; Transforms; Training; Task analysis; Coordinate measuring machines; Manifolds; Deep metric learning; image retrieval; self-supervised learning; generative model; PERSON REIDENTIFICATION;
DOI
10.1109/TCSVT.2021.3124908
CLC Classification
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology];
Subject Classification
0808; 0809;
Abstract
The core purpose of deep metric learning is to construct an embedding space in which objects of the same class are gathered together and objects of different classes are pushed apart. Most existing approaches rely on inter-class characteristics, e.g., class-level information or instance-level similarity, to capture the semantic relevance of data points and enforce a large margin between different classes in the embedding space. However, intra-class characteristics, e.g., the local manifold structure or the relative relationships within the same class, are usually overlooked during learning. As a result, the output embeddings struggle to produce a good ranking when multiple positive samples exist, and the local structure of the embedding space cannot be fully exploited due to the lack of relative ranking information. Consequently, a model that loses sight of intra-class variance is prone to overfitting the training set and generalizes poorly to the test set (unseen classes). This paper presents a novel self-supervised synthesis ranking auxiliary framework that captures intra-class as well as inter-class characteristics for better metric learning. Our method designs a polar-coordinate synthetic sample generation scheme that produces measurable intra-class variance of varying strength and diversity in the latent space, simulating local intra-class structural changes in the original data domain. It then formulates a self-supervised learning procedure to fully exploit this property and preserve it in the embedding space. As a result, the learned embedding space not only keeps inter-class discrimination but also retains subtle intra-class diversity, leading to better global and local embedding structures.
Extensive experiments on five benchmarks show that our method significantly outperforms state-of-the-art methods, improving both retrieval and ranking performance by 2%-4%.
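The polar-coordinate idea in the abstract — synthesizing intra-class variants of an anchor embedding at a chosen radius (strength) and direction (diversity), so the radii give a known ranking for self-supervision — can be illustrated with a minimal sketch. This is not the paper's implementation; the function name and parameters below are hypothetical, and the sketch only shows the generation step, not the ranking loss.

```python
import numpy as np

def synthesize_ranked_positives(anchor, num_samples=4, max_radius=0.5, seed=0):
    """Place synthetic intra-class samples around an anchor embedding.

    Each sample lies at a random unit direction and an increasing radius
    from the anchor (a polar parameterization: anchor + r * u). The radii
    induce a known ranking: a sample generated with a smaller radius
    should remain closer to the anchor in the learned embedding space,
    which is the self-supervised signal the framework exploits.
    """
    rng = np.random.default_rng(seed)
    dim = anchor.shape[0]
    radii = np.linspace(max_radius / num_samples, max_radius, num_samples)
    samples = []
    for r in radii:
        direction = rng.normal(size=dim)
        direction /= np.linalg.norm(direction)  # unit direction (the "angle")
        samples.append(anchor + r * direction)  # polar form: anchor + r * u
    return np.stack(samples), radii

anchor = np.array([1.0, 0.0, 0.0])
samples, radii = synthesize_ranked_positives(anchor)
# distances back to the anchor equal the radii, recovering the intended ranking
dists = np.linalg.norm(samples - anchor, axis=1)
```

In the actual method such synthetic variants would be generated in a latent space and a ranking objective would train the embedding to preserve their radius order; here the radii serve only as the free ground-truth labels that make the procedure self-supervised.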
Pages: 4736-4750 (15 pages)
Related Papers
50 in total
  • [1] Self-Supervised Deep Metric Learning for Pointsets
    Arsomngern, Pattaramanee
    Long, Cheng
    Suwajanakorn, Supasorn
    Nutanong, Sarana
    2021 IEEE 37TH INTERNATIONAL CONFERENCE ON DATA ENGINEERING (ICDE 2021), 2021, : 2171 - 2176
  • [2] Self-supervised deep metric learning for ancient papyrus fragments retrieval
    Pirrone, Antoine
    Beurton-Aimar, Marie
    Journet, Nicholas
    INTERNATIONAL JOURNAL ON DOCUMENT ANALYSIS AND RECOGNITION, 2021, 24 (03) : 219 - 234
  • [4] Semi- and Self-Supervised Metric Learning for Remote Sensing Applications
    Hernandez-Sequeira, Itza
    Fernandez-Beltran, Ruben
    Pla, Filiberto
    IEEE GEOSCIENCE AND REMOTE SENSING LETTERS, 2024, 21 : 1 - 5
  • [5] Self-Supervised Lie Algebra Representation Learning via Optimal Canonical Metric
    Yu, Xiaohan
    Pan, Zicheng
    Zhao, Yang
    Gao, Yongsheng
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2025, 36 (02) : 3547 - 3558
  • [6] Self-Supervised Learning of Audio Representations From Permutations With Differentiable Ranking
    Carr, Andrew N.
    Berthet, Quentin
    Blondel, Mathieu
    Teboul, Olivier
    Zeghidour, Neil
    IEEE SIGNAL PROCESSING LETTERS, 2021, 28 : 708 - 712
  • [7] Neighborhood-Adaptive Multi-Cluster Ranking for Deep Metric Learning
    Li, Pandeng
    Xie, Hongtao
    Jiang, Yan
    Ge, Jiannan
    Zhang, Yongdong
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2023, 33 (04) : 1952 - 1965
  • [8] Self-Supervised Deep Correlation Tracking
    Yuan, Di
    Chang, Xiaojun
    Huang, Po-Yao
    Liu, Qiao
    He, Zhenyu
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2021, 30 : 976 - 985
  • [9] Deep Bregman divergence for self-supervised representations learning
    Rezaei, Mina
    Soleymani, Farzin
    Bischl, Bernd
    Azizi, Shekoofeh
    COMPUTER VISION AND IMAGE UNDERSTANDING, 2023, 235
  • [10] Deep Self-Supervised Diversity Promoting Learning on Hierarchical Hyperspheres for Regularization
    Kim, Youngsung
    Hyun, Yoonsuk
    Han, Jae-Joon
    Yang, Eunho
    Hwang, Sung Ju
    Shin, Jinwoo
    IEEE ACCESS, 2023, 11 : 146208 - 146222