Attribute based spatio-temporal person retrieval in video surveillance

Cited by: 4
Authors
Shoitan, Rasha [1]
Moussa, Mona M. [1]
El Nemr, Heba A. [1,2]
Affiliations
[1] Elect Res Inst, Comp & Syst Dept, Cairo, Egypt
[2] Misr Univ Sci & Technol, Comp & Software Engn, October City, Egypt
Keywords
Multi-object tracking; Person retrieval; Attribute description; Object tracking
DOI
10.1016/j.aej.2022.07.053
Chinese Library Classification
T [Industrial Technology];
Subject Classification Code
08;
Abstract
Many venues, such as airports, railway stations, and shopping malls, use video surveillance systems for security and monitoring. However, searching for and retrieving people based on attribute descriptions across large volumes of video is difficult, particularly under weather variations and in crowded places. Most existing attribute-based person retrieval systems consist of two main modules: object detection and person attribute recognition. Common drawbacks of the object detection stage in existing methods are false positives, missed detections, and multiple bounding boxes for the same object. Moreover, attribute recognition suffers from low accuracy when a single-attribute classifier is used, while attribute errors propagate through cascading multi-attribute classifiers. This paper overcomes these issues by applying the ByteTrack algorithm instead of plain object detection, exploiting each person's spatio-temporal information to generate a tube that maintains all the boxes containing the object and associates high- and low-score boxes without raising false-positive detections. Linking each person's bounding boxes together also yields more accurate attribute recognition than classifying the attributes of each bounding box separately. Moreover, the proposed algorithm merges selected predictions of two attribute recognition algorithms to improve recognition performance. An extensive empirical evaluation was carried out on the SoftBioSearch database. The simulation results reveal that the proposed retrieval algorithm provides effective retrieval performance, exceeding the best conventional method by 14%. (c) 2022 THE AUTHORS. Published by Elsevier BV on behalf of Faculty of Engineering, Alexandria University. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).
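The tube-level idea in the abstract (aggregating per-box attribute predictions over a person's linked bounding boxes, then merging two recognizers' outputs) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the function name, the mean-based aggregation, and the weighted merge are all assumptions for exposition.

```python
import numpy as np

def fuse_tube_attributes(scores_a, scores_b, weight_a=0.5):
    """Fuse per-frame attribute probabilities from two classifiers.

    scores_a, scores_b: arrays of shape (num_frames, num_attributes)
    holding each classifier's per-box attribute probabilities for one
    tube (the boxes of one person linked by the tracker).
    Returns one per-attribute probability vector for the whole tube.
    """
    a = np.asarray(scores_a, dtype=float)
    b = np.asarray(scores_b, dtype=float)
    # Aggregate over the tube: averaging across the linked boxes
    # smooths out single-frame recognition errors.
    tube_a = a.mean(axis=0)
    tube_b = b.mean(axis=0)
    # Merge the two classifiers' tube-level predictions (assumed
    # weighted average; the paper merges selected predictions).
    return weight_a * tube_a + (1.0 - weight_a) * tube_b

# Example: one tube of 3 frames, 2 attributes (e.g. "hat", "bag").
fused = fuse_tube_attributes([[0.9, 0.2], [0.8, 0.1], [0.7, 0.3]],
                             [[0.6, 0.4], [0.7, 0.2], [0.8, 0.3]])
```

A retrieval query would then threshold or rank `fused` against the requested attribute description, one score vector per tube rather than per detection.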
Pages: 441-454
Number of pages: 14
References (59 in total)
[41]   SiamMOT: Siamese Multi-Object Tracking [J].
Shuai, Bing ;
Berneshawi, Andrew ;
Li, Xinyu ;
Modolo, Davide ;
Tighe, Joseph .
2021 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, CVPR 2021, 2021, :12367-12377
[42]   Person Attribute Recognition with a Jointly-trained Holistic CNN Model [J].
Sudowe, Patrick ;
Spitzer, Hannah ;
Leibe, Bastian .
2015 IEEE INTERNATIONAL CONFERENCE ON COMPUTER VISION WORKSHOP (ICCVW), 2015, :329-337
[43]   Attention-Based Pedestrian Attribute Analysis [J].
Tan, Zichang ;
Yang, Yang ;
Wan, Jun ;
Hang, Hanyuan ;
Guo, Guodong ;
Li, Stan Z. .
IEEE TRANSACTIONS ON IMAGE PROCESSING, 2019, 28 (12) :6126-6140
[44]   Improving Pedestrian Attribute Recognition With Weakly-Supervised Multi-Scale Attribute-Specific Localization [J].
Tang, Chufeng ;
Sheng, Lu ;
Zhang, Zhaoxiang ;
Hu, Xiaolin .
2019 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2019), 2019, :4996-5005
[45]   Siamese Instance Search for Tracking [J].
Tao, Ran ;
Gavves, Efstratios ;
Smeulders, Arnold W. M. .
2016 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2016, :1420-1429
[46]  
Tsai R. Y., 1986, Proceedings CVPR '86: IEEE Computer Society Conference on Computer Vision and Pattern Recognition (Cat. No.86CH2290-5), P364
[47]  
TSAI RY, 1987, IEEE T ROBOTIC AUTOM, V3, P323, DOI 10.1109/JRA.1987.1087109
[48]   Pedestrian attribute recognition: A survey [J].
Wang, Xiao ;
Zheng, Shaofei ;
Yang, Rui ;
Zheng, Aihua ;
Chen, Zhe ;
Tang, Jin ;
Luo, Bin .
PATTERN RECOGNITION, 2022, 121
[49]   Object tracking via dense SIFT features and low-rank representation [J].
Wang, Yong ;
Luo, Xinbin ;
Ding, Lu ;
Wu, Jingjing .
SOFT COMPUTING, 2019, 23 (20) :10173-10186
[50]  
Wojke N, 2017, IEEE IMAGE PROC, P3645, DOI 10.1109/ICIP.2017.8296962