Spatial-Temporal Graph Attention Gated Recurrent Transformer Network for Traffic Flow Forecasting

Cited by: 8
Authors
Wu, Di [1 ,2 ]
Peng, Kai [1 ,2 ]
Wang, Shangguang [3 ]
Leung, Victor C. M. [4 ,5 ]
Affiliations
[1] Huaqiao Univ, Coll Engn, Quanzhou 362021, Peoples R China
[2] Nanjing Univ, State Key Lab Novel Software Technol, Nanjing 210023, Peoples R China
[3] Beijing Univ Posts & Telecommun, State Key Lab Networking & Switching Technol, Beijing 100876, Peoples R China
[4] Shenzhen Univ, Coll Comp Sci & Software Engn, Shenzhen 518060, Peoples R China
[5] Univ British Columbia, Dept Elect & Comp Engn, Vancouver, BC, Canada
Source
IEEE INTERNET OF THINGS JOURNAL | 2024, Vol. 11, No. 8
Funding
US National Science Foundation;
Keywords
Graph attention networks (GATs); spatial-temporal dependencies; traffic flow forecasting; transformer; NEURAL-NETWORKS;
DOI
10.1109/JIOT.2023.3340182
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline code
0812 ;
Abstract
With the significant increase in the number of motor vehicles, road-related issues such as traffic congestion and accidents have also escalated. An accurate and efficient traffic flow forecasting model is therefore essential for helping drivers plan their journeys. Despite advances in forecasting models, three issues remain: 1) failure to effectively use cyclical data; 2) failure to adequately capture spatial dependencies; and 3) high time complexity and memory usage. To tackle these challenges, we present a novel spatial-temporal graph attention gated recurrent transformer network (STGAGRTN) for traffic flow forecasting. Specifically, a spatial transformer module extracts dynamic spatial dependencies among individual nodes, going beyond the limitation of considering only neighboring nodes. We then propose a temporal transformer to extract periodic information from traffic data and capture long-term dependencies. Additionally, we employ two classical techniques to complement these modules in extracting features. By incorporating comprehensive spatial-temporal characteristics into our model, we can accurately predict multiple nodes simultaneously. Finally, we optimize the computational complexity of the transformer module from O(n²) to O(n log n). Our model has undergone extensive testing on four real-world datasets, providing compelling evidence of its superior predictive capability.
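The abstract's claimed reduction of transformer attention from O(n²) to O(n log n) is characteristic of log-sparse attention, where each query attends only to positions at exponentially growing distances. The sketch below is an illustration of that general idea in NumPy, not the paper's actual mechanism; the function names and the specific index pattern are assumptions for demonstration only.

```python
import numpy as np

def logsparse_indices(t):
    """Positions step t may attend to: t itself plus steps at
    exponentially growing distances back (t-1, t-2, t-4, t-8, ...).
    Each query touches O(log n) keys, giving O(n log n) total work."""
    idx = {t}
    d = 1
    while t - d >= 0:
        idx.add(t - d)
        d *= 2
    return sorted(idx)

def logsparse_attention(q, k, v):
    """q, k, v: arrays of shape (n, dim). Each output row is a softmax-
    weighted mix of v rows restricted to the log-sparse index set."""
    n, dim = q.shape
    out = np.zeros_like(v)
    for t in range(n):
        idx = logsparse_indices(t)
        scores = q[t] @ k[idx].T / np.sqrt(dim)  # scaled dot-product
        w = np.exp(scores - scores.max())        # stable softmax
        w /= w.sum()
        out[t] = w @ v[idx]
    return out
```

With n = 16, step 8 attends to steps {0, 4, 6, 7, 8} rather than all 9 preceding steps, and the gap widens as n grows.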
Pages: 14267-14281
Page count: 15