Performance Analysis of Long Short-Term Memory Predictive Neural Networks on Time Series Data

Cited by: 14
Authors
Bolboaca, Roland [1 ]
Haller, Piroska [1 ]
Affiliation
[1] George Emil Palade Univ Med Pharm Sci & Technol Ta, Fac Engn & Informat Technol, Targu Mures 540139, Romania
Keywords
long short-term memory (LSTM); recurrent neural network (RNN); teacher forcing; prediction; performance analysis; benchmarking; machine learning; Tennessee Eastman process; time series
DOI
10.3390/math11061432
CLC Number
O1 [Mathematics]
Discipline Code
0701 ; 070101 ;
Abstract
Long short-term memory neural networks have been proposed as a means of creating accurate models from large time series datasets originating from various fields. These models can further be used in prediction, control, or anomaly-detection algorithms. However, finding the hyperparameters that maximize different performance criteria remains a challenge for both novice and experienced users. Hyperparameter optimization can be a resource-intensive and time-consuming task, particularly when the impact of the individual hyperparameters on neural network performance is not well understood. Teacher forcing denotes a training procedure in which the ground-truth output from the previous time step is fed as input to the current time step, while during testing the network's own predictions are fed back instead. This paper presents a comprehensive examination of the impact of hyperparameters on the prediction performance of long short-term memory neural networks, with and without teacher forcing. The study tests long short-term memory neural networks with two variations of teacher forcing, in two prediction modes, using two configurations (i.e., multi-input single-output and multi-input multi-output) on a well-known chemical process simulation dataset. Furthermore, this paper demonstrates the applicability of a long short-term memory neural network with a modified teacher forcing approach in a process state monitoring system. Over 100,000 experiments were conducted with varying hyperparameters and in multiple neural network operation modes, revealing the direct impact of each tested hyperparameter on the training and testing procedures.
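The teacher forcing procedure described in the abstract can be illustrated with a minimal sketch. The toy linear one-step predictor below stands in for an LSTM cell, and the series, weight value, and function names are illustrative assumptions, not taken from the paper; the sketch only contrasts feeding back ground truth (training-time behaviour) with feeding back the model's own predictions (test-time, free-running behaviour).

```python
def step(x_prev, w):
    """Toy one-step predictor (stand-in for an LSTM cell)."""
    return w * x_prev

def rollout(series, w, teacher_forcing):
    """Generate one-step-ahead predictions over the series.

    teacher_forcing=True  -> each step receives the ground-truth previous
                             value, as during training with teacher forcing.
    teacher_forcing=False -> each step receives the model's own previous
                             prediction (free-running, test-time mode).
    """
    preds = []
    x = series[0]
    for t in range(1, len(series)):
        y_hat = step(x, w)
        preds.append(y_hat)
        # Choose the next input: ground truth vs. the prediction itself.
        x = series[t] if teacher_forcing else y_hat
    return preds

series = [1.0, 2.0, 4.0, 8.0]  # ground truth: value doubles each step
w = 1.9                         # slightly imperfect one-step model

tf_preds = rollout(series, w, teacher_forcing=True)
fr_preds = rollout(series, w, teacher_forcing=False)
# With teacher forcing, each step's error stays local; in free-running
# mode the same model drifts further from the truth at every step,
# because each prediction error is fed back into the next input.
```

This compounding of errors in free-running mode is precisely why the paper evaluates the networks both with and without teacher forcing and in two distinct prediction modes.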
Pages: 35