Part-aware Progressive Unsupervised Domain Adaptation for Person Re-Identification

Cited by: 69
Authors
Yang, Fan [1 ]
Yan, Ke [2 ]
Lu, Shijian [3 ]
Jia, Huizhu [1 ]
Xie, Don [1 ]
Yu, Zongqiao [2 ]
Guo, Xiaowei [2 ]
Huang, Feiyue [2 ]
Gao, Wen [1 ]
Affiliations
[1] Peking Univ, Natl Engn Lab Video Technol, Beijing 100871, Peoples R China
[2] Tencent, Youtu Lab, Shanghai 201103, Peoples R China
[3] Nanyang Technol Univ, Singapore 639798, Singapore
Keywords
Feature extraction; Cameras; Dictionaries; Adaptation models; Supervised learning; Measurement; Image color analysis; Unsupervised domain adaptation; Person re-identification; Part-aware; Feature alignment; Progressive adaptation; TRACKING; DESCRIPTOR; RETRIEVAL; NETWORK
DOI
10.1109/TMM.2020.3001522
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Unsupervised domain adaptation (UDA) aims to mitigate the domain shift that occurs when transferring knowledge from a labeled source domain to an unlabeled target domain. Although UDA has been applied to unsupervised person re-identification (ReID), the relations of feature distributions across the source and target domains remain underexplored: existing methods either ignore local relations or give insufficient consideration to negative transfer when the two domains do not share identical label spaces. In light of the above, this paper presents an innovative part-aware progressive adaptation network (PPAN) that exploits both global and local relations for UDA-based ReID across domains. A multi-branch network is developed that explicitly learns discriminative feature representations from whole-body images and body-part images under the supervision of the labeled source domain. Within each branch, an independent UDA constraint aligns the global and local feature distributions of the labeled source domain with those of the unlabeled target domain. In addition, a novel progressive adaptation strategy (PAS) effectively alleviates the negative influence of outlier source identities. The proposed unsupervised ReID model is evaluated on five widely used datasets (Market-1501, DukeMTMC-reID, CUHK03, VIPeR, and PRID), and experimental results demonstrate its superior robustness and effectiveness relative to state-of-the-art approaches.
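The abstract describes an independent distribution-alignment constraint applied per branch (whole body plus body parts). The exact alignment loss is not given in this record; the sketch below illustrates the general idea with a linear-kernel MMD (maximum mean discrepancy) term summed over hypothetical branches — the function names, branch layout, and choice of MMD are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def linear_mmd(source_feats, target_feats):
    # Squared distance between the mean embeddings of two feature
    # batches: a simple linear-kernel MMD estimate of distribution gap.
    gap = source_feats.mean(axis=0) - target_feats.mean(axis=0)
    return float(np.sum(gap ** 2))

def part_aware_alignment_loss(source_branches, target_branches):
    # One independent alignment term per branch (global + each part),
    # summed — mirroring the per-branch UDA constraint in the abstract.
    return sum(linear_mmd(s, t) for s, t in zip(source_branches, target_branches))

rng = np.random.default_rng(0)
# Toy features: one global branch and two part branches,
# each a batch of 8 samples with 16-dim embeddings.
src = [rng.normal(0.0, 1.0, (8, 16)) for _ in range(3)]
tgt = [rng.normal(0.5, 1.0, (8, 16)) for _ in range(3)]  # shifted target domain
loss = part_aware_alignment_loss(src, tgt)
print(loss > 0.0)
```

In training, a term like this would be minimized jointly with the supervised source-domain ReID losses, pulling each branch's target features toward the corresponding source distribution.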
Pages: 1681-1695 (15 pages)