A Combined Model for Multivariate Time Series Forecasting Based on MLP-Feedforward Attention-LSTM

Cited by: 10
Authors
Liu, Yuntong [1 ]
Zhao, Chunna [1 ]
Huang, Yaqun [1 ]
Affiliations
[1] Yunnan Univ, Sch Informat Sci & Engn, Kunming 650504, Yunnan, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Time series analysis; Feature extraction; Recurrent neural networks; Predictive models; Hidden Markov models; Logic gates; Forecasting; Multivariate time series; multilayer perceptron; feed-forward attention mechanism; long short-term memory network;
DOI
10.1109/ACCESS.2022.3192430
CLC Classification Number
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812 ;
Abstract
Multivariate time series forecasting has long been of great practical significance and has attracted researchers from a diverse range of fields. However, it is difficult to analyze the relationships and transformation laws among multivariate data, and hence hard to obtain accurate predictions. In recent years, long short-term memory (LSTM) networks have shown a strong ability to handle the nonlinearity and long memory of time series data. Although LSTM can also process multivariate data, it cannot assign varying degrees of attention to the different variables. To address this issue, a multivariate time series prediction model based on a multilayer perceptron (MLP), a feed-forward attention mechanism, and LSTM is proposed in this paper. First, the MLP module maps the multivariate input sequences into a latent space, producing mapping features that are easier to capture. Then, these features are adaptively assigned attention weights by the feed-forward attention mechanism. Finally, the LSTM module uses the attention-weighted feature sequences to make the final predictions. The experimental results show that combining the MLP layer with the feed-forward attention layer is effective for extracting multivariate features, and that the proposed framework (a combined MLP-Feedforward attention-LSTM model) achieves better performance than the baselines.
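The pipeline the abstract describes (MLP mapping, feed-forward attention weighting, LSTM prediction) can be illustrated with a minimal sketch of the feed-forward attention step. This is an assumption-laden illustration, not the authors' implementation: the parameter names `W` and `v`, the score function `v · tanh(W h_t)`, and all dimensions are hypothetical choices typical of feed-forward attention, since the record gives no model details.

```python
import numpy as np

def feedforward_attention(H, W, v):
    """Feed-forward attention over mapped features.

    H : (T, d) array of MLP-mapped features, one row per time step.
    W, v : hypothetical score parameters; e_t = v . tanh(W @ h_t).
    Returns the attention-weighted features and the weights.
    """
    e = np.tanh(H @ W.T) @ v              # unnormalized score per time step, shape (T,)
    a = np.exp(e - e.max())               # softmax with max-subtraction for stability
    a /= a.sum()                          # weights sum to 1 across time steps
    return H * a[:, None], a              # weighted features feed into the LSTM

# Toy usage: 5 time steps of 4-dimensional mapped features.
rng = np.random.default_rng(0)
H = rng.normal(size=(5, 4))
W = rng.normal(size=(4, 4))
v = rng.normal(size=4)
Hw, a = feedforward_attention(H, W, v)
```

In the full model, `Hw` would be passed to an LSTM whose final hidden state produces the forecast; here only the weighting step is shown.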
Pages: 88644-88654
Page count: 11