Training Dynamic Neural Networks Using the Extended Kalman Filter for Multi-Step-Ahead Predictions

Cited by: 1
Authors
Chernodub, Artem [1 ]
Affiliations
[1] Inst Math Machines & Syst NASU, Neurotechnol Dept, Glushkova 42 Ave, UA-03187 Kiev, Ukraine
Source
Keywords
multi-step-ahead prediction; mini-batch Extended Kalman Filter; Forecasted Propagation Through Time; Backpropagation Through Time
DOI
10.1007/978-3-319-09903-3_11
Chinese Library Classification (CLC)
TP [Automation technology; computer technology]
Subject classification code
0812
Abstract
This paper is dedicated to single-step-ahead and multi-step-ahead time series prediction problems. We consider feedforward and recurrent neural network architectures together with different methods for computing derivatives and for optimization, and we analyze their advantages and disadvantages. We propose a novel method for training feedforward neural networks with tapped delay lines that improves multi-step-ahead predictions. A special mini-batch scheme for calculating derivatives, called Forecasted Propagation Through Time, is introduced for the Extended Kalman Filter training method. Experiments on well-known benchmark time series are presented.
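To make the setting concrete, the sketch below illustrates the two standard ingredients named in the abstract: a feedforward predictor whose input is a tapped delay line of past samples, trained with the conventional global Extended Kalman Filter weight update, and iterated multi-step-ahead forecasting in which predictions are fed back into the delay line. This is not the paper's Forecasted Propagation Through Time method; the network sizes, noise covariances, and the toy series are illustrative assumptions only.

# Minimal sketch (not the paper's FPTT method): a single-hidden-layer predictor
# with a tapped delay line, trained by the standard global EKF weight update,
# then used for iterated multi-step-ahead forecasting. All sizes, covariances,
# and the toy series below are assumed for illustration.
import numpy as np

rng = np.random.default_rng(0)

D, H = 5, 8                        # delay-line length and hidden units (assumed)
n_w = H * D + H + H + 1            # total number of weights (EKF state dimension)
w = 0.1 * rng.standard_normal(n_w)
P = np.eye(n_w) * 100.0            # weight-error covariance (assumed init)
Q = np.eye(n_w) * 1e-6             # artificial process noise (assumed)
R = np.array([[1e-2]])             # measurement noise (assumed)

def unpack(w):
    """Split the flat weight vector into layer parameters."""
    W1 = w[:H * D].reshape(H, D)
    b1 = w[H * D:H * D + H]
    W2 = w[H * D + H:H * D + H + H]
    b2 = w[-1]
    return W1, b1, W2, b2

def forward(w, x):
    """Network output and its Jacobian d(output)/d(weights)."""
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(W1 @ x + b1)
    y = W2 @ h + b2
    dh = 1.0 - h ** 2                       # derivative of tanh
    dW1 = np.outer(W2 * dh, x).ravel()      # dy/dW1
    db1 = W2 * dh                           # dy/db1
    dW2 = h                                 # dy/dW2
    db2 = np.array([1.0])                   # dy/db2
    J = np.concatenate([dW1, db1, dW2, db2])[None, :]   # shape (1, n_w)
    return y, J

# Toy series (assumed): noisy sine wave
series = np.sin(0.2 * np.arange(400)) + 0.05 * rng.standard_normal(400)

# Single-step-ahead EKF training pass over the series
for t in range(D, 300):
    x = series[t - D:t]                     # tapped delay line of past samples
    d = series[t]                           # desired next value
    y, Hj = forward(w, x)
    S = Hj @ P @ Hj.T + R                   # innovation covariance
    K = P @ Hj.T @ np.linalg.inv(S)         # Kalman gain
    w = w + (K * (d - y)).ravel()           # weight (state) update
    P = P - K @ Hj @ P + Q                  # covariance update

# Iterated multi-step-ahead forecast: feed predictions back into the delay line
horizon = 20
x = series[300 - D:300].copy()
forecast = []
for _ in range(horizon):
    y, _ = forward(w, x)
    forecast.append(float(y))
    x = np.roll(x, -1)
    x[-1] = y
print(np.round(forecast, 3))

Treating the weights as the EKF state with the network as the measurement function is the classical EKF training formulation; the paper's contribution, as the abstract states, is a mini-batch derivative scheme (Forecasted Propagation Through Time) aimed specifically at improving such iterated multi-step-ahead forecasts, which the sketch above does not implement.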
Pages: 221 - 243
Number of pages: 23