Robustness of LSTM neural networks for multi-step forecasting of chaotic time series

Cited by: 117
Authors
Sangiorgio, Matteo [1 ]
Dercole, Fabio [1 ]
Affiliations
[1] Politecn Milan, Dept Elect Informat & Bioengn, Via Ponzio 34-5, I-20133 Milan, Italy
Keywords
Deterministic chaos; Recurrent neural networks; Teacher forcing; Exposure bias; Multi-step prediction; Nonlinear time series; DETERMINING EMBEDDING DIMENSION; PREDICTION; SYSTEMS; IDENTIFICATION; ATTRACTORS; ALGORITHM; RECONSTRUCTION; DYNAMICS; MODELS; MAPS;
DOI
10.1016/j.chaos.2020.110045
Chinese Library Classification
O1 [Mathematics];
Subject classification codes
0701 ; 070101 ;
Abstract
Recurrent neurons (and LSTM cells in particular) have proven effective as basic blocks for building sequence-to-sequence architectures, which represent the state-of-the-art approach in many sequential tasks related to natural language processing. In this work, these architectures are proposed as general-purpose, multi-step predictors for nonlinear time series. We analyze artificial, noise-free data generated by chaotic oscillators and compare LSTM nets with the benchmarks set by feed-forward, one-step-recursive, and multi-output predictors. We focus on two different training methods for LSTM nets. The traditional one makes use of so-called teacher forcing, i.e., the ground-truth data are used as input at each time step ahead, rather than the outputs predicted for the previous steps. Conversely, the second feeds the previous predictions back into the recurrent neurons, as happens when the network is used in forecasting. LSTM predictors robustly combine the strengths of the two benchmark competitors, i.e., the good short-term performance of one-step-recursive predictors and greatly improved mid- to long-term predictions with respect to feed-forward, multi-output predictors. Training LSTM predictors without teacher forcing is recommended to improve accuracy and robustness, and it ensures a more uniform distribution of the predictive power within the chaotic attractor. We also show that LSTM architectures maintain good performance when the number of time lags included in the input differs from the actual embedding dimension of the dataset, a feature that is very important when working with real data. (C) 2020 Elsevier Ltd. All rights reserved.
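The distinction between the two training regimes described in the abstract can be illustrated with a minimal sketch (not the paper's code): a slightly mis-specified logistic map plays the role of a trained one-step predictor, and the two multi-step rollout strategies differ only in what is fed back as the next input. All function names here are illustrative assumptions.

```python
# Sketch of teacher-forced vs. free-running multi-step prediction.
# A perturbed logistic map stands in for an imperfect learned model.

def logistic(x, r=3.9):
    """True chaotic system (logistic map in the chaotic regime)."""
    return r * x * (1.0 - x)

def model(x, r=3.899):
    """Hypothetical one-step predictor with a small parameter error."""
    return r * x * (1.0 - x)

def rollout(x0, horizon, truth, teacher_forcing):
    """Predict `horizon` steps ahead starting from x0.

    teacher_forcing=True  -> next input is the ground-truth value
                             (the 'exposure bias' training setting);
    teacher_forcing=False -> next input is the model's own prediction
                             (the free-running forecasting setting).
    """
    preds, x = [], x0
    for t in range(horizon):
        y = model(x)
        preds.append(y)
        x = truth[t + 1] if teacher_forcing else y  # the only difference
    return preds

# Ground-truth chaotic trajectory.
horizon = 20
truth = [0.4]
for _ in range(horizon):
    truth.append(logistic(truth[-1]))

tf_preds = rollout(truth[0], horizon, truth, teacher_forcing=True)
fr_preds = rollout(truth[0], horizon, truth, teacher_forcing=False)

tf_err = [abs(p - truth[t + 1]) for t, p in enumerate(tf_preds)]
fr_err = [abs(p - truth[t + 1]) for t, p in enumerate(fr_preds)]

# Teacher-forced errors stay bounded by the one-step model error, while
# free-running errors compound along the chaotic trajectory -- the
# train/inference mismatch the paper's second training method removes.
print(max(tf_err), max(fr_err))
```

Under teacher forcing every prediction is a one-step-ahead prediction, so the error never exceeds the model's one-step accuracy; in free-running mode the small one-step error is amplified by the system's positive Lyapunov exponent at every step, which is why training in the free-running configuration better matches the forecasting setting.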
Pages: 12
Related papers
50 records
  • [1] Multi-Step wind power forecasting model Using LSTM networks, Similar Time Series and LightGBM
    Cao, Yukun
    Gui, Liai
    2018 5TH INTERNATIONAL CONFERENCE ON SYSTEMS AND INFORMATICS (ICSAI), 2018, : 192 - 197
  • [2] Multi-step ahead time series forecasting for different data patterns based on LSTM recurrent neural network
    Liu Yunpeng
    Hou Di
    Bao Junpeng
    Qi Yong
    2017 14TH WEB INFORMATION SYSTEMS AND APPLICATIONS CONFERENCE (WISA 2017), 2017, : 305 - 310
  • [3] A multi-step forecasting method of time series
    Zhou, JB
    Wang, YK
    Yang, GY
    Zhao, YL
    13TH CONFERENCE ON PROBABILITY AND STATISTICS IN THE ATMOSPHERIC SCIENCES, 1996, : 361 - 362
  • [4] Adversarial self-attentive time-variant neural networks for multi-step time series forecasting
    Gao, Changxia
    Zhang, Ning
    Li, Youru
    Lin, Yan
    Wan, Huaiyu
    EXPERT SYSTEMS WITH APPLICATIONS, 2023, 231
  • [5] Self-attention-based time-variant neural networks for multi-step time series forecasting
    Gao, Changxia
    Zhang, Ning
    Li, Youru
    Bian, Feng
    Wan, Huaiyu
    NEURAL COMPUTING & APPLICATIONS, 2022, 34 (11): : 8737 - 8754
  • [6] Self-attention-based time-variant neural networks for multi-step time series forecasting
    Changxia Gao
    Ning Zhang
    Youru Li
    Feng Bian
    Huaiyu Wan
    Neural Computing and Applications, 2022, 34 : 8737 - 8754
  • [7] Multi-Step Forecasting of Meteorological Time Series Using CNN-LSTM with Decomposition Methods
    Coutinho, Elua Ramos
    Madeira, Jonni G. F.
    Borges, Derick G. F.
    Springer, Marcus V.
    de Oliveira, Elizabeth M.
    Coutinho, Alvaro L. G. A.
    WATER RESOURCES MANAGEMENT, 2025,
  • [8] A dilated convolution network-based LSTM model for multi-step prediction of chaotic time series
    Rongxi Wang
    Caiyuan Peng
    Jianmin Gao
    Zhiyong Gao
    Hongquan Jiang
    Computational and Applied Mathematics, 2020, 39
  • [9] A dilated convolution network-based LSTM model for multi-step prediction of chaotic time series
    Wang, Rongxi
    Peng, Caiyuan
    Gao, Jianmin
    Gao, Zhiyong
    Jiang, Hongquan
    COMPUTATIONAL & APPLIED MATHEMATICS, 2020, 39 (01):
  • [10] Multi-scale adaptive attention-based time-variant neural networks for multi-step time series forecasting
    Gao, Changxia
    Zhang, Ning
    Li, Youru
    Lin, Yan
    Wan, Huaiyu
    APPLIED INTELLIGENCE, 2023, 53 (23) : 28974 - 28993