Pedestrian Group Re-Identification and Trajectory Prediction Through Zone-Based Clustering

Cited by: 0
Authors
Chen, Mingzuoyang [1 ]
Banitaan, Shadi [1 ]
Maleki, Mina [1 ]
Affiliations
[1] Univ Detroit Mercy, Dept Elect & Comp Engn & Comp Sci, Detroit, MI 48221 USA
Source
IEEE ACCESS | 2024, Vol. 12
Keywords
Trajectory; Pedestrians; Accuracy; Predictive models; Feature extraction; Object detection; Long short-term memory (LSTM); Cameras; Computer vision; Identification of persons; Hungarian algorithm; human-view camera; re-identification; trajectory prediction; zone-based group detection; network
DOI
10.1109/ACCESS.2024.3428438
Chinese Library Classification
TP [Automation and Computer Technology]
Discipline Code
0812
Abstract
Pedestrian trajectory prediction is a critical task in computer vision, aimed at forecasting a pedestrian's future locations from their past movements. Traditional trajectory prediction models focus primarily on individuals, which is challenging in densely populated areas due to occlusions. Occlusions not only complicate the re-identification of pedestrians once they reappear but also increase processing time, owing to the more complex procedures and the larger number of objects involved. However, pedestrians commonly travel in groups. This observation led us to propose a novel approach that predicts the future trajectories of pedestrian groups rather than individuals, addressing both the complexity of predicting movements in crowded environments and the problems caused by pedestrian occlusion. In this work, we introduce a methodology for detecting pedestrian groups, re-identifying them, and predicting their future trajectories. Unlike state-of-the-art methods that re-identify and predict trajectories for individual pedestrians, our approach operates on groups, emphasizing reduced processing time while maintaining high accuracy. The pipeline begins with object detection to obtain pedestrian coordinates. A zone-based clustering method then forms groups, after which a dedicated group re-identification step constructs continuous trajectories for these groups rather than for individual pedestrians. Finally, a group trajectory prediction technique estimates the future movements of the groups. Both object detection and group detection are applied every five frames to generate these trajectories. The effectiveness of the approach was validated with several evaluation metrics, including Average Displacement Error (ADE), Final Displacement Error (FDE), Cumulative Matching Characteristics (CMC) scores, IDF1 scores, and identity switches (IDs), all assessed on the MOT17 dataset. These evaluations confirm the practicality and accuracy of our method for predicting pedestrian trajectories and highlight its efficiency: processing time is reduced by 7.6% compared with individual trajectory prediction, demonstrating the method's potential for real-time, accident-prevention applications.
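To make two of the steps named in the abstract concrete, the Python sketch below shows (a) the standard ADE/FDE displacement metrics used for evaluation and (b) Hungarian-algorithm association of group centroids between frames, as in the re-identification step. This is a minimal sketch of the standard formulations, not the authors' implementation; the function names, the centroid-distance cost, and the gating threshold max_dist are illustrative assumptions.

```python
# Minimal sketch (not the authors' code): standard ADE/FDE metrics and
# Hungarian-algorithm matching of group centroids between frames.
import numpy as np
from scipy.optimize import linear_sum_assignment

def ade_fde(pred, gt):
    """Average and Final Displacement Error.

    pred, gt: arrays of shape (T, 2) -- predicted and ground-truth
    (x, y) positions over T future time steps.
    """
    dists = np.linalg.norm(pred - gt, axis=1)  # per-step L2 error
    return dists.mean(), dists[-1]             # ADE, FDE

def match_groups(prev_centroids, curr_centroids, max_dist=50.0):
    """Associate group centroids across frames (illustrative re-ID step).

    prev_centroids: (N, 2) centroids from the earlier frame.
    curr_centroids: (M, 2) centroids from the current frame.
    Returns (prev_idx, curr_idx) pairs whose centroid distance is
    below max_dist (an assumed gating threshold in pixels).
    """
    # Pairwise Euclidean cost matrix between old and new centroids.
    cost = np.linalg.norm(
        prev_centroids[:, None, :] - curr_centroids[None, :, :], axis=2
    )
    rows, cols = linear_sum_assignment(cost)   # Hungarian algorithm
    return [(r, c) for r, c in zip(rows, cols) if cost[r, c] < max_dist]

# Example: one predicted group trajectory vs. ground truth.
pred = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 2.2]])
gt   = np.array([[0.0, 0.1], [1.1, 1.0], [2.0, 2.0]])
print(ade_fde(pred, gt))  # -> (ADE, FDE)
```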
Pages: 101549-101562
Number of pages: 14