Pose-Guided Feature Alignment for Occluded Person Re-Identification

Cited by: 506
Authors
Miao, Jiaxu [1 ,2 ]
Wu, Yu [1 ,2 ]
Liu, Ping [2 ]
Ding, Yuhang [1 ]
Yang, Yi [2 ]
Affiliations
[1] Baidu Res, Melbourne, Vic, Australia
[2] Univ Technol Sydney, ReLER, Sydney, NSW, Australia
Source
2019 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2019) | 2019
DOI
10.1109/ICCV.2019.00063
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Persons are often occluded by various obstacles in person retrieval scenarios. Previous person re-identification (re-id) methods either overlook this issue or resolve it based on an extreme assumption. To alleviate the occlusion problem, we propose to detect the occluded regions and explicitly exclude them during feature generation and matching. In this paper, we introduce a novel method named Pose-Guided Feature Alignment (PGFA), which exploits pose landmarks to disentangle useful information from occlusion noise. During the feature construction stage, our method utilizes human landmarks to generate attention maps. The generated attention maps indicate whether a specific body part is occluded and guide our model to attend to the non-occluded regions. During matching, we explicitly partition the global feature into parts and use the pose landmarks to indicate which partial features belong to the target person. Only the visible regions are utilized for retrieval. In addition, we construct a large-scale dataset for the occluded person re-id problem, namely Occluded-DukeMTMC, which is by far the largest dataset for occluded person re-id. Extensive experiments are conducted on our constructed occluded re-id dataset, two partial re-id datasets, and two commonly used holistic re-id datasets. Our method largely outperforms existing person re-id methods on the three occlusion datasets, while retaining top performance on the two holistic datasets.
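Illustrative sketch (not the authors' released code): the matching step described in the abstract can be read as keeping only the part features whose body parts are marked visible by pose landmarks, then averaging distances over the parts visible in both the query and the gallery image. The part count, confidence threshold, and part-to-keypoint grouping below are assumptions for illustration, not values taken from the paper.

import numpy as np

PART_COUNT = 6        # horizontal stripes of the feature map (assumed)
KPT_THRESH = 0.3      # landmark confidence threshold (assumed)

def part_visibility(kpt_conf, parts_to_kpts):
    """kpt_conf: (K,) pose-landmark confidences from an off-the-shelf estimator.
    parts_to_kpts: list of keypoint-index lists, one per body part.
    Returns a 0/1 visibility flag per part."""
    return np.array([
        1.0 if np.max(kpt_conf[idx]) > KPT_THRESH else 0.0
        for idx in parts_to_kpts
    ])

def partial_distance(feat_q, feat_g, vis_q, vis_g):
    """feat_*: (PART_COUNT, D) L2-normalized part features.
    vis_*: (PART_COUNT,) visibility flags. The distance uses only parts
    visible in both images; it falls back to all parts if none overlap."""
    shared = vis_q * vis_g
    if shared.sum() == 0:
        shared = np.ones(PART_COUNT)
    d = np.linalg.norm(feat_q - feat_g, axis=1)   # per-part distances
    return float((d * shared).sum() / shared.sum())

Under this scheme, an image whose lower body is occluded contributes only its visible upper-body stripes to the distance, which matches the behavior the abstract describes for occluded queries.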
Pages: 542-551
Number of pages: 10