MSASGCN : Multi-Head Self-Attention Spatiotemporal Graph Convolutional Network for Traffic Flow Forecasting

Cited: 2
Authors
Cao, Yang [1 ]
Liu, Detian [1 ]
Yin, Qizheng [1 ]
Xue, Fei [1 ]
Tang, Hengliang [1 ]
Affiliation
[1] Beijing Wuzi Univ, Sch Informat, Beijing 101149, Peoples R China
Keywords
NEURAL-NETWORK; PREDICTION;
DOI
10.1155/2022/2811961
Chinese Library Classification
TU [Architecture Science];
Discipline Classification Code
0813 ;
Abstract
Traffic flow forecasting is an essential task of an intelligent transportation system (ITS), closely related to intelligent transportation management and resource scheduling. Dynamic spatial-temporal dependencies in traffic data make traffic flow forecasting a challenging task. Most existing approaches cannot model dynamic spatial and temporal correlations well enough to achieve strong forecasting performance. The multi-head self-attention mechanism is a valuable method for capturing dynamic spatial-temporal correlations, and combining it with graph convolutional networks is a promising solution. We therefore propose a multi-head self-attention spatiotemporal graph convolutional network (MSASGCN) model. It effectively captures local correlations and potential global correlations of spatial structures, handles the dynamic evolution of the road network, and, in the time dimension, effectively captures dynamic temporal correlations. Experiments on two real datasets verify the stability of our proposed model, which obtains better prediction performance than the baseline algorithms. According to MAE and RMSE results, the error metrics are significantly reduced compared with traditional time series prediction methods and with deep learning methods that do not use graph neural networks. Compared with advanced traffic flow forecasting methods, our model also achieves a performance improvement and more stable predictions. We also discuss some open problems and challenges in traffic forecasting.
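The abstract describes combining multi-head self-attention (to learn dynamic pairwise correlations between road-network nodes) with graph convolution (to exploit the fixed road topology). The paper's actual architecture is not reproduced here; the following is a minimal NumPy sketch of the two generic building blocks, with randomly initialized projection weights and an illustrative symmetric-normalized graph convolution, purely to show how the pieces fit together.

```python
import numpy as np

def multi_head_self_attention(X, num_heads, rng):
    """Generic multi-head self-attention over node features X of shape (N, d).

    d must be divisible by num_heads. Projection weights are random here,
    standing in for learned parameters.
    """
    N, d = X.shape
    dk = d // num_heads
    heads = []
    for _ in range(num_heads):
        Wq, Wk, Wv = (rng.standard_normal((d, dk)) / np.sqrt(d) for _ in range(3))
        Q, K, V = X @ Wq, X @ Wk, X @ Wv
        scores = Q @ K.T / np.sqrt(dk)          # (N, N) dynamic pairwise scores
        A = np.exp(scores - scores.max(axis=1, keepdims=True))
        A /= A.sum(axis=1, keepdims=True)       # row-wise softmax: attention weights
        heads.append(A @ V)                     # (N, dk) per-head output
    return np.concatenate(heads, axis=1)        # (N, d)

def graph_convolution(X, adj):
    """One propagation step D^{-1/2} (A + I) D^{-1/2} X over the fixed road graph."""
    A_hat = adj + np.eye(adj.shape[0])          # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    return (A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]) @ X

# Toy usage: 5 road sensors, 8-dimensional features, 2 attention heads.
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 8))
adj = np.array([[0, 1, 0, 0, 1],
                [1, 0, 1, 0, 0],
                [0, 1, 0, 1, 0],
                [0, 0, 1, 0, 1],
                [1, 0, 0, 1, 0]], dtype=float)
out = graph_convolution(multi_head_self_attention(X, num_heads=2, rng=rng), adj)
```

In this sketch, attention supplies input-dependent (dynamic) node-to-node weights while the graph convolution mixes features along the static adjacency; how the two are interleaved across spatial and temporal blocks is specific to the MSASGCN design described in the paper.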
Pages: 15
Related Papers
50 records
  • [1] Spatiotemporal Residual Graph Attention Network for Traffic Flow Forecasting
    Zhang, Qingyong
    Li, Changwu
    Su, Fuwen
    Li, Yuanzheng
    IEEE INTERNET OF THINGS JOURNAL, 2023, 10 (13) : 11518 - 11532
  • [2] Spatiotemporal Graph Convolutional Network for Multi-Scale Traffic Forecasting
    Wang, Yi
    Jing, Changfeng
    ISPRS INTERNATIONAL JOURNAL OF GEO-INFORMATION, 2022, 11 (02)
  • [3] ADDGCN: A Novel Approach with Down-Sampling Dynamic Graph Convolution and Multi-Head Attention for Traffic Flow Forecasting
    Li, Zuhua
    Wei, Siwei
    Wang, Haibo
    Wang, Chunzhi
    APPLIED SCIENCES-BASEL, 2024, 14 (10):
  • [4] Masked multi-head self-attention for causal speech enhancement
    Nicolson, Aaron
    Paliwal, Kuldip K.
    SPEECH COMMUNICATION, 2020, 125 : 80 - 96
  • [5] Attention based spatiotemporal graph attention networks for traffic flow forecasting
    Wang, Yi
    Jing, Changfeng
    Xu, Shishuo
    Guo, Tao
    INFORMATION SCIENCES, 2022, 607 : 869 - 883
  • [6] DDP-GCN: Multi-graph convolutional network for spatiotemporal traffic forecasting
    Lee, Kyungeun
    Rhee, Wonjong
    TRANSPORTATION RESEARCH PART C-EMERGING TECHNOLOGIES, 2022, 134
  • [7] A combined traffic flow forecasting model based on graph convolutional network and attention mechanism
    Zhang, Hong
    Chen, Linlong
    Cao, Jie
    Zhang, Xijun
    Kan, Sunan
    INTERNATIONAL JOURNAL OF MODERN PHYSICS C, 2021, 32 (12):
  • [8] An integrated multi-head dual sparse self-attention network for remaining useful life prediction
    Zhang, Jiusi
    Li, Xiang
    Tian, Jilun
    Luo, Hao
    Yin, Shen
    RELIABILITY ENGINEERING & SYSTEM SAFETY, 2023, 233
  • [9] Graph attention temporal convolutional network for traffic speed forecasting on road networks
    Zhang, Ke
    He, Fang
    Zhang, Zhengchao
    Lin, Xi
    Li, Meng
    TRANSPORTMETRICA B-TRANSPORT DYNAMICS, 2021, 9 (01) : 153 - 171
  • [10] SMGformer: integrating STL and multi-head self-attention in deep learning model for multi-step runoff forecasting
    Wang, Wen-chuan
    Gu, Miao
    Hong, Yang-hao
    Hu, Xiao-xue
    Zang, Hong-fei
    Chen, Xiao-nan
    Jin, Yan-guo
    SCIENTIFIC REPORTS, 2024, 14 (01):