Trajectory Unified Transformer for Pedestrian Trajectory Prediction

Cited by: 19
Authors
Shi, Liushuai [1]
Wang, Le [1]
Zhou, Sanping [1]
Hua, Gang [2]
Affiliations
[1] Xi An Jiao Tong Univ, Inst Artificial Intelligence & Robot, Natl Engn Res Ctr Visual Informat & Applicat, Natl Key Lab Human Machine Hybrid Augmented Intel, Xian, Peoples R China
[2] Wormpex AI Res, Bellevue, WA USA
Source
2023 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2023) | 2023
Funding
National Key Research and Development Program of China;
Keywords
DOI
10.1109/ICCV51070.2023.00887
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Pedestrian trajectory prediction is an essential component of understanding human behavior. Recent work achieves state-of-the-art performance through hand-designed post-processing, e.g., clustering. However, this post-processing suffers from expensive inference time and neglects the probability of each predicted trajectory, which can disturb downstream safety decisions. In this paper, we present the Trajectory Unified TRansformer, called TUTR, which unifies the trajectory prediction components, social interaction and multimodal trajectory prediction, into a transformer encoder-decoder architecture to effectively remove the need for post-processing. Specifically, TUTR parses the relationships across various motion modes using an explicit global prediction and an implicit mode-level transformer encoder. Then, TUTR attends to social interactions with neighbors via a social-level transformer decoder. Finally, a dual prediction head forecasts diverse trajectories and their corresponding probabilities in parallel without post-processing. TUTR achieves state-of-the-art accuracy and improves inference speed by roughly 10x - 40x over previous well-tuned state-of-the-art methods that rely on post-processing.
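The abstract describes a mode-level transformer encoder, a social-level transformer decoder, and a dual prediction head that outputs candidate trajectories and their probabilities in parallel. The sketch below is a minimal, hypothetical PyTorch rendering of that pipeline; the embedding scheme, the learnable mode tokens, the layer sizes, and all names (TUTRSketch, hist_embed, mode_tokens, traj_head, prob_head) are illustrative assumptions, not the authors' released implementation.

```python
# Minimal, hypothetical sketch of a TUTR-style pipeline as described in the
# abstract: a mode-level transformer encoder, a social-level transformer
# decoder, and a dual prediction head producing K trajectories and their
# probabilities in one forward pass (no clustering-style post-processing).
import torch
import torch.nn as nn


class TUTRSketch(nn.Module):
    def __init__(self, obs_len=8, pred_len=12, num_modes=20, d_model=64):
        super().__init__()
        self.num_modes, self.pred_len = num_modes, pred_len
        # Embed the observed (x, y) history of each pedestrian.
        self.hist_embed = nn.Linear(obs_len * 2, d_model)
        # Learnable mode tokens standing in for the global motion modes (assumption).
        self.mode_tokens = nn.Parameter(torch.randn(num_modes, d_model))
        # Mode-level encoder: relates the target's history to every motion mode.
        self.mode_encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True),
            num_layers=2)
        # Social-level decoder: mode features attend to neighbor features.
        self.social_decoder = nn.TransformerDecoder(
            nn.TransformerDecoderLayer(d_model, nhead=4, batch_first=True),
            num_layers=2)
        # Dual prediction: trajectories and mode probabilities in parallel.
        self.traj_head = nn.Linear(d_model, pred_len * 2)
        self.prob_head = nn.Linear(d_model, 1)

    def forward(self, target_hist, neighbor_hist):
        # target_hist:   (B, obs_len, 2)    observed track of the target agent
        # neighbor_hist: (B, N, obs_len, 2) observed tracks of N neighbors
        B = target_hist.shape[0]
        tgt = self.hist_embed(target_hist.flatten(1))             # (B, d)
        nbr = self.hist_embed(neighbor_hist.flatten(2))           # (B, N, d)
        # Combine each mode token with the target embedding, then encode.
        modes = self.mode_tokens.unsqueeze(0) + tgt.unsqueeze(1)  # (B, K, d)
        modes = self.mode_encoder(modes)
        # Let the mode features attend to neighbors (social interaction).
        social = self.social_decoder(modes, nbr)                  # (B, K, d)
        trajs = self.traj_head(social).view(B, self.num_modes, self.pred_len, 2)
        probs = self.prob_head(social).squeeze(-1).softmax(dim=-1)
        return trajs, probs                                       # no post-processing


if __name__ == "__main__":
    model = TUTRSketch()
    trajs, probs = model(torch.randn(4, 8, 2), torch.randn(4, 6, 8, 2))
    print(trajs.shape, probs.shape)  # (4, 20, 12, 2) and (4, 20)
```

Producing all modes and their probabilities in a single forward pass is what would allow inference without the clustering step that the abstract identifies as the main cost of prior methods.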
Pages: 9641 - 9650
Number of pages: 10
Related Papers
50 records in total
  • [21] Holistic LSTM for Pedestrian Trajectory Prediction
    Quan, Ruijie
    Zhu, Linchao
    Wu, Yu
    Yang, Yi
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2021, 30 : 3229 - 3239
  • [22] Pedestrian trajectory prediction method based on pedestrian pose
    Wang R.
    Song X.
    Chen K.
    Gong K.
    Zhang J.
    Beijing Hangkong Hangtian Daxue Xuebao/Journal of Beijing University of Aeronautics and Astronautics, 2023, 49 (07) : 1743 - 1754
  • [23] Aircraft Trajectory Prediction With Inverted Transformer
    Yoon, Seokbin
    Lee, Keumjin
    IEEE ACCESS, 2025, 13 : 26318 - 26330
  • [24] Non-probability sampling network based on anomaly pedestrian trajectory discrimination for pedestrian trajectory prediction
    Liu, Quankai
    Sang, Haifeng
    Wang, Jinyu
    Chen, Wangxing
    Liu, Yulong
    IMAGE AND VISION COMPUTING, 2024, 143
  • [25] Early Prediction of a Pedestrian's Trajectory at Intersections
    Goldhammer, Michael
    Gerhard, Matthias
    Zernetsch, Stefan
    Doll, Konrad
    Brunsmann, Ulrich
    2013 16TH INTERNATIONAL IEEE CONFERENCE ON INTELLIGENT TRANSPORTATION SYSTEMS - (ITSC), 2013, : 237 - 242
  • [26] Sparse Pedestrian Character Learning for Trajectory Prediction
    Dong, Yonghao
    Wang, Le
    Zhou, Sanping
    Hua, Gang
    Sun, Changyin
    IEEE TRANSACTIONS ON MULTIMEDIA, 2024, 26 : 11070 - 11082
  • [27] Multi-Camera Trajectory Forecasting: Pedestrian Trajectory Prediction in a Network of Cameras
    Styles, Olly
    Guha, Tanaya
    Sanchez, Victor
    Kot, Alex
    2020 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION WORKSHOPS (CVPRW 2020), 2020, : 4379 - 4382
  • [28] Pedestrian Trajectory Prediction in Extremely Crowded Scenarios
    Shi, Xiaodan
    Shao, Xiaowei
    Guo, Zhiling
    Wu, Guangming
    Zhang, Haoran
    Shibasaki, Ryosuke
    SENSORS, 2019, 19 (05)
  • [29] Step Attention: Sequential Pedestrian Trajectory Prediction
    Zhang, Ethan
    Masoud, Neda
    Bandegi, Mahdi
    Lull, Joseph
    Malhan, Rajesh K.
    IEEE SENSORS JOURNAL, 2022, 22 (08) : 8071 - 8083
  • [30] Stochastic Sampling Simulation for Pedestrian Trajectory Prediction
    Anderson, Cyrus
    Du, Xiaoxiao
    Vasudevan, Ram
    Johnson-Roberson, Matthew
    2019 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2019, : 4236 - 4243