An LSTM based Encoder-Decoder Model for Multi-Step Traffic Flow Prediction

Citations: 0
Authors
Du, Shengdong [1 ]
Li, Tianrui [1 ]
Yang, Yan [1 ]
Gong, Xun [1 ]
Horng, Shi-Jinn [2 ]
Affiliations
[1] Southwest Jiaotong Univ, Natl Engn Lab Integrated Transportat Big Data App, Sch Informat Sci & Technol, Chengdu, Peoples R China
[2] Natl Taiwan Univ Sci & Technol, Dept Comp Sci & Informat Engn, Taipei, Taiwan
Source
2019 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN) | 2019
Funding
National Natural Science Foundation of China;
Keywords
traffic flow prediction; long short-term memory networks; encoder-decoder; temporal attention mechanism; NETWORKS;
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Traffic flow prediction is a key research problem in intelligent transportation systems. In this paper, we propose an encoder-decoder model with a temporal attention mechanism for the multi-step traffic flow prediction task. The model uses LSTMs as the encoder and decoder to learn the long-term dependencies and nonlinear characteristics of multivariate traffic-flow-related time series, and introduces a temporal attention mechanism for more accurate prediction. Experiments on a real traffic flow dataset show that the proposed model predicts more accurately than classic shallow learning and baseline deep learning models. Moreover, the predicted traffic flow values match the ground truth well not only for short-step-ahead prediction but also for longer-step-ahead prediction, which validates that the proposed model is a good option for the real-time and forward-looking requirements of the traffic flow prediction task.
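The architecture the abstract describes can be sketched roughly as follows. This is a minimal illustrative reconstruction, not the authors' code: the layer sizes, the single-layer LSTM choice, the use of the previous prediction as decoder input, and all names (`AttnEncoderDecoder`, `n_features`, `horizon`) are assumptions.

```python
# Hypothetical sketch of an LSTM encoder-decoder with temporal attention
# for multi-step traffic flow prediction, as outlined in the abstract.
# All hyperparameters and names here are illustrative assumptions.
import torch
import torch.nn as nn

class AttnEncoderDecoder(nn.Module):
    def __init__(self, n_features, hidden=64, horizon=6):
        super().__init__()
        self.horizon = horizon
        self.encoder = nn.LSTM(n_features, hidden, batch_first=True)
        self.decoder = nn.LSTMCell(1, hidden)   # consumes the previous prediction
        self.attn = nn.Linear(hidden * 2, 1)    # scores each encoder time step
        self.out = nn.Linear(hidden * 2, 1)     # context + decoder state -> flow value

    def forward(self, x):
        # x: (batch, time, n_features) multivariate traffic series
        enc_out, (h, c) = self.encoder(x)       # enc_out: (B, T, H)
        h, c = h.squeeze(0), c.squeeze(0)
        y = x.new_zeros(x.size(0), 1)           # previous prediction, starts at zero
        preds = []
        for _ in range(self.horizon):
            h, c = self.decoder(y, (h, c))
            # temporal attention: weight encoder states by relevance to h
            score = self.attn(torch.cat(
                [enc_out, h.unsqueeze(1).expand_as(enc_out)], dim=-1))
            w = torch.softmax(score, dim=1)     # attention weights over time, (B, T, 1)
            context = (w * enc_out).sum(dim=1)  # weighted summary of the input, (B, H)
            y = self.out(torch.cat([context, h], dim=-1))
            preds.append(y)
        return torch.cat(preds, dim=1)          # (B, horizon) multi-step forecast

model = AttnEncoderDecoder(n_features=5, hidden=64, horizon=6)
out = model(torch.randn(8, 24, 5))  # 24 past steps in, 6-step forecast out
print(out.shape)                    # torch.Size([8, 6])
```

Recomputing the attention weights at every decoder step is what lets longer-horizon predictions keep attending to the most relevant past observations, which is the property the abstract credits for the model's accuracy at longer step-ahead settings.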
Pages: 8