Spatial-Temporal Graph Attention Gated Recurrent Transformer Network for Traffic Flow Forecasting

Cited by: 5
Authors
Wu, Di [1 ,2 ]
Peng, Kai [1 ,2 ]
Wang, Shangguang [3 ]
Leung, Victor C. M. [4 ,5 ]
Affiliations
[1] Huaqiao Univ, Coll Engn, Quanzhou 362021, Peoples R China
[2] Nanjing Univ, State Key Lab Novel Software Technol, Nanjing 210023, Peoples R China
[3] Beijing Univ Posts & Telecommun, State Key Lab Networking & Switching Technol, Beijing 100876, Peoples R China
[4] Shenzhen Univ, Coll Comp Sci & Software Engn, Shenzhen 518060, Peoples R China
[5] Univ British Columbia, Dept Elect & Comp Engn, Vancouver, BC, Canada
Source
IEEE INTERNET OF THINGS JOURNAL | 2024, Vol. 11, Issue 8
Funding
U.S. National Science Foundation;
Keywords
Graph attention networks (GATs); spatial-temporal dependencies; traffic flow forecasting; transformer; NEURAL-NETWORKS;
DOI
10.1109/JIOT.2023.3340182
CLC Number
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
With the significant increase in the number of motor vehicles, road-related issues such as traffic congestion and accidents have also escalated. The development of an accurate and efficient traffic flow forecasting model is essential for helping car owners plan their journeys. Despite advancements in forecasting models, three issues remain: 1) failing to effectively use cyclical data; 2) failing to adequately capture spatial dependencies; and 3) high time complexity and memory usage. To tackle these challenges, we present a novel spatial-temporal graph attention gated recurrent transformer network (STGAGRTN) for traffic flow forecasting. Specifically, a spatial transformer module extracts dynamic spatial dependencies among individual nodes, going beyond the limitation of considering only neighboring nodes. Subsequently, we propose a temporal transformer to extract periodic information from traffic data and capture long-term dependencies. Additionally, we utilize two classical techniques to complement the aforementioned modules in extracting features. By incorporating comprehensive spatial-temporal characteristics into our model, we can accurately predict multiple nodes simultaneously. Finally, we optimize the computational complexity of the transformer module from O(n^2) to O(n log n). Our model has undergone extensive testing on four real-world datasets, providing compelling evidence of its superior predictive capabilities.
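The abstract only states that the transformer module's complexity was reduced from O(n^2) to O(n log n) and does not describe the mechanism. One standard way to reach that bound is log-sparse attention, where each query attends to keys at exponentially spaced offsets. The PyTorch sketch below illustrates that access pattern purely as an assumption, not the authors' STGAGRTN implementation; note that the dense masking used here is for clarity only, and an actual asymptotic saving requires a sparse or blocked attention kernel.

```python
# Hypothetical illustration (not from the paper): log-sparse self-attention,
# one common way to obtain an O(n log n) attention pattern, where each of
# the n queries attends to only O(log n) keys instead of all n.
import torch
import torch.nn.functional as F


def log_sparse_mask(n: int) -> torch.Tensor:
    """Boolean (n, n) mask: query t may attend to keys at past offsets
    {0, 1, 2, 4, 8, ...}, i.e. O(log n) keys per query."""
    mask = torch.zeros(n, n, dtype=torch.bool)
    for t in range(n):
        mask[t, t] = True
        offset = 1
        while t - offset >= 0:
            mask[t, t - offset] = True
            offset *= 2
    return mask


def masked_attention(q, k, v, mask):
    """Scaled dot-product attention restricted by a boolean mask.
    q, k, v: (batch, n, d); mask: (n, n), True = attention allowed.
    Computed densely here for clarity; a real O(n log n) implementation
    would evaluate only the allowed entries with a sparse kernel."""
    d = q.size(-1)
    scores = q @ k.transpose(-2, -1) / d ** 0.5      # (batch, n, n)
    scores = scores.masked_fill(~mask, float("-inf"))
    return F.softmax(scores, dim=-1) @ v             # (batch, n, d)


if __name__ == "__main__":
    batch, n, d = 2, 12, 8                 # 12 time steps, 8-dim features
    q, k, v = (torch.randn(batch, n, d) for _ in range(3))
    out = masked_attention(q, k, v, log_sparse_mask(n))
    print(out.shape)                       # torch.Size([2, 12, 8])
```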
Pages: 14267 - 14281
Page count: 15
Related Papers
(50 in total)
  • [31] DGTNet: dynamic graph attention transformer network for traffic flow forecasting
    Chen, Jing
    Li, Wuzhi
    Chen, Shuixuan
    Zhang, Guowei
    ENGINEERING RESEARCH EXPRESS, 2024, 6 (04):
  • [32] Traffic Flow Forecasting Based on Transformer with Diffusion Graph Attention Network
    Zhang, Hong
    Wang, Hongyan
    Chen, Linlong
    Zhao, Tianxin
    Kan, Sunan
    INTERNATIONAL JOURNAL OF AUTOMOTIVE TECHNOLOGY, 2024, 25 (03) : 455 - 468
  • [33] Orthogonal Spatial-Temporal Graph Convolutional Networks for Traffic Flow Forecasting
    Fei, Yanhong
    Hu, Ming
    Wei, Xian
    Chen, Mingsong
    2022 IEEE SYMPOSIUM SERIES ON COMPUTATIONAL INTELLIGENCE (SSCI), 2022, : 71 - 76
  • [34] Hybrid Spatial-Temporal Graph Convolutional Network for Long-Term Traffic Flow Forecasting
    Wu, Zihao
    Lou, Ping
    2023 IEEE 8TH INTERNATIONAL CONFERENCE ON BIG DATA ANALYTICS, ICBDA, 2023, : 224 - 229
  • [35] An improved dynamic Chebyshev graph convolution network for traffic flow prediction with spatial-temporal attention
    Liao, Lyuchao
    Hu, Zhiyuan
    Zheng, Yuxin
    Bi, Shuoben
    Zou, Fumin
    Qiu, Huai
    Zhang, Maolin
    APPLIED INTELLIGENCE, 2022, 52 (14) : 16104 - 16116
  • [36] A traffic flow forecasting method based on hybrid spatial-temporal gated convolution
    Zhang, Ying
    Yang, Songhao
    Wang, Hongchao
    Cheng, Yongqiang
    Wang, Jinyu
    Cao, Liping
    An, Ziying
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2025, 16 (03) : 1805 - 1817
  • [37] JointGraph: joint pre-training framework for traffic forecasting with spatial-temporal gating diffusion graph attention network
    Kong, Xiangyuan
    Wei, Xiang
    Zhang, Jian
    Xing, Weiwei
    Lu, Wei
    APPLIED INTELLIGENCE, 2023, 53 (11) : 13723 - 13740
  • [38] uTransformer: unified spatial-temporal transformer with external factors for traffic flow forecasting
    Li, Junyan
    Dong, Wenyong
    Gui, Xuewen
    JOURNAL OF SUPERCOMPUTING, 2025, 81 (01)
  • [39] An Optimized Temporal-Spatial Gated Graph Convolution Network for Traffic Forecasting
    Guo, Kan
    Hu, Yongli
    Sun, Yanfeng
    Qian, Zhen
    Gao, Junbin
    Yin, Baocai
    IEEE INTELLIGENT TRANSPORTATION SYSTEMS MAGAZINE, 2022, 14 (01) : 153 - 162
  • [40] Transformer network with decoupled spatial-temporal embedding for traffic flow forecasting
    Sun, Wei
    Cheng, Rongzhang
    Jiao, Yingqi
    Gao, Junbo
    Zheng, Zhedian
    Lu, Nan
    APPLIED INTELLIGENCE, 2023, 53 : 30148 - 30168