Collaborative Contrastive Refining for Weakly Supervised Person Search

Cited by: 3
Authors
Jia, Chengyou [1 ,2 ]
Luo, Minnan [1 ,2 ]
Yan, Caixia [1 ,2 ]
Zhu, Linchao [3 ]
Chang, Xiaojun [4 ,5 ]
Zheng, Qinghua [1 ,2 ]
Affiliations
[1] Xi An Jiao Tong Univ, Sch Comp Sci & Technol, Xian 710049, Shaanxi, Peoples R China
[2] Xi An Jiao Tong Univ, Key Lab Intelligent Networks & Network Secur, Minist Educ, Xian 710049, Shaanxi, Peoples R China
[3] Zhejiang Univ, Coll Comp Sci & Technol, Hangzhou 310027, Peoples R China
[4] Univ Technol Sydney, Fac Informat Technol, Ultimo, NSW 2007, Australia
[5] Mohamed Bin Zayed Univ Artificial Intelligence MBZ, Dept Comp Vis, Abu Dhabi, U Arab Emirates
Funding
National Natural Science Foundation of China;
Keywords
Person search; unsupervised person Re-ID; weakly supervised learning; clustering algorithm; REIDENTIFICATION; NETWORK; LOCALIZATION;
DOI
10.1109/TIP.2023.3308393
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Weakly supervised person search trains a model with only bounding-box annotations and no human-annotated identities. Clustering algorithms are commonly used to assign pseudo-labels for this task; however, inaccurate pseudo-labels and imbalanced identity distributions can introduce severe label and sample noise. In this work, we propose Collaborative Contrastive Refining (CCR), a novel weakly supervised framework for person search that jointly refines the pseudo-labels and the sample-learning process with different contrastive strategies. Specifically, we adopt a hybrid contrastive strategy that leverages both visual and contextual clues to refine pseudo-labels, and a sample-mining and noise-contrastive strategy that reduces the negative impact of imbalanced distributions by distinguishing positive samples from noise samples. Our method brings two main advantages: 1) it yields better clustering results for refining pseudo-labels by exploring the hybrid similarity; 2) it better distinguishes query samples from noise samples when refining the sample-learning process. Extensive experiments demonstrate that our approach outperforms state-of-the-art weakly supervised methods by a large margin (more than 3% mAP on CUHK-SYSU). Moreover, by leveraging more diverse unlabeled data, our method achieves comparable or even better performance than state-of-the-art supervised methods.
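The abstract's pipeline, clustering appearance features into pseudo-identities and then training with a contrastive loss against cluster prototypes, can be sketched as follows. This is a minimal illustration, not the paper's method: the threshold-based graph clustering, the `thresh` and `temp` hyper-parameters, and both function names are assumptions standing in for the clustering algorithm and hybrid/noise-contrastive strategies described above.

```python
import numpy as np

def pseudo_labels(feats, thresh=0.6):
    """Assign cluster pseudo-labels as connected components of the
    thresholded cosine-similarity graph (a simple stand-in for the
    clustering step; `thresh` is an assumed hyper-parameter)."""
    f = feats / np.linalg.norm(feats, axis=1, keepdims=True)
    adj = (f @ f.T) >= thresh          # who counts as "same identity"
    n, labels, cur = len(feats), -np.ones(len(feats), dtype=int), 0
    for i in range(n):
        if labels[i] != -1:
            continue
        stack, labels[i] = [i], cur    # flood-fill one component
        while stack:
            j = stack.pop()
            for k in np.flatnonzero(adj[j]):
                if labels[k] == -1:
                    labels[k] = cur
                    stack.append(k)
        cur += 1
    return labels

def cluster_contrastive_loss(feats, labels, temp=0.07):
    """InfoNCE-style loss pulling each sample toward its own cluster
    centroid and away from the others (samples far from every centroid
    contribute large loss terms, which is where a noise-contrastive
    strategy would down-weight them)."""
    f = feats / np.linalg.norm(feats, axis=1, keepdims=True)
    c = np.stack([f[labels == u].mean(axis=0) for u in np.unique(labels)])
    c /= np.linalg.norm(c, axis=1, keepdims=True)
    logits = f @ c.T / temp
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    p = np.exp(logits)
    p /= p.sum(axis=1, keepdims=True)
    return -np.log(p[np.arange(len(f)), labels]).mean()
```

In the full framework this would run iteratively: re-cluster with refined (hybrid) similarities each epoch, then update the network with the contrastive objective.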
Pages: 4951-4963
Page count: 13