A novel method for lake level prediction: deep echo state network

Cited by: 12
Authors
Alizamir, Meysam [1 ]
Kisi, Ozgur [2 ]
Kim, Sungwon [3 ]
Heddam, Salim [4 ]
Affiliations
[1] Islamic Azad Univ, Hamedan Branch, Dept Civil Engn, Hamadan, Hamadan, Iran
[2] Ilia State Univ, Fac Nat Sci & Engn, Tbilisi, Georgia
[3] Dongyang Univ, Dept Railrd Construct & Safety Engn, Yeongju 36040, South Korea
[4] Fac Sci, Dept Agron, Hydraul Div, Lab Res Biodivers Interact Ecosyst & Biotechnol, Univ 20 Aout 1955, Route el Hadaik, Skikda, BP, Algeria
Keywords
Lake level prediction; Deep echo state network; Extreme learning machine; ANNs; Regression tree; EXTREME LEARNING-MACHINE; SUPPORT VECTOR MACHINE; GLOBAL SOLAR-RADIATION; WATER-LEVEL; FEEDFORWARD NETWORKS; MODE DECOMPOSITION; REGRESSION TREE; NEURAL-NETWORK; FLUCTUATIONS; CLASSIFICATION;
DOI
10.1007/s12517-020-05965-9
Chinese Library Classification
P [Astronomy and Earth Sciences];
Subject Classification
07;
Abstract
Accurate prediction of lake level fluctuations is essential for water resources planning and management. In the present study, the potential of a novel method, the deep echo state network (Deep ESN), is investigated for monthly lake level prediction, and its results are compared with three data-driven methods: artificial neural networks (ANNs), the extreme learning machine (ELM), and the regression tree (Reg. Tree). The methods are validated using the root mean square error (RMSE), determination coefficient (R²), and Nash-Sutcliffe efficiency (NSE) criteria. The Deep ESN outperforms the ELM, ANNs, and Reg. Tree, improving RMSE accuracy by 61-62-96%, 10-14-84%, and 8-23-80% for 1-, 2-, and 3-month-ahead lake level predictions, respectively. The Deep ESN also improves on the accuracy of the ELM, ANNs, and Reg. Tree by 1.1-1.1-443%, 1.1-1.6-250%, and 1.6-6.5-184% in terms of the NSE indicator for the different lead-time horizons. Among the ELM, ANNs, and Reg. Tree, the Reg. Tree provides the worst predictions, while the ELM outperforms the ANNs across all three time horizons.
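The abstract gives no implementation details, so the sketch below is only an orienting, hypothetical example of the general technique: a stacked-reservoir (deep) echo state network with a ridge-regression readout, evaluated with the RMSE and NSE criteria named above on a synthetic seasonal series. The reservoir sizes, spectral radius, leak rate, ridge penalty, and the synthetic data are assumptions for illustration and are not the authors' configuration.

```python
# Minimal illustrative sketch of a deep (stacked-reservoir) echo state network
# with a ridge-regression readout, scored with RMSE and NSE on synthetic data.
# All hyperparameters below are assumed values, not those used in the paper.
import numpy as np

rng = np.random.default_rng(0)

def make_reservoir(n_in, n_res, spectral_radius=0.9, input_scale=0.5):
    """Random input and recurrent weights, rescaled to the target spectral radius."""
    W_in = rng.uniform(-1, 1, (n_res, n_in)) * input_scale
    W = rng.uniform(-0.5, 0.5, (n_res, n_res))
    W *= spectral_radius / max(abs(np.linalg.eigvals(W)))
    return W_in, W

def run_deep_esn(u, layers, leak=0.3):
    """Drive the stacked reservoirs with input sequence u of shape (T, n_in);
    each layer's state sequence drives the next layer. Returns all states, concatenated."""
    states_all = []
    x_in = u
    for W_in, W in layers:
        T = x_in.shape[0]
        x = np.zeros(W.shape[0])
        states = np.zeros((T, W.shape[0]))
        for t in range(T):
            x = (1 - leak) * x + leak * np.tanh(W_in @ x_in[t] + W @ x)
            states[t] = x
        states_all.append(states)
        x_in = states             # next layer is driven by this layer's states
    return np.hstack(states_all)  # readout sees every layer's states

def rmse(y, yhat):
    return np.sqrt(np.mean((y - yhat) ** 2))

def nse(y, yhat):
    return 1.0 - np.sum((y - yhat) ** 2) / np.sum((y - y.mean()) ** 2)

# Synthetic monthly "lake level" series (seasonal cycle + trend + noise) as a stand-in.
T = 480
t = np.arange(T)
level = 10 + 0.002 * t + 1.5 * np.sin(2 * np.pi * t / 12) + 0.2 * rng.standard_normal(T)

lead = 1                                  # 1-month-ahead prediction
u = level[:-lead].reshape(-1, 1)          # input: current level
y = level[lead:]                          # target: level `lead` months ahead

# Two stacked reservoirs of 100 units each (assumed sizes).
layers = [make_reservoir(1, 100), make_reservoir(100, 100)]
X = run_deep_esn(u, layers)

washout, split = 24, int(0.7 * len(y))
Xtr, ytr = X[washout:split], y[washout:split]
Xte, yte = X[split:], y[split:]

# Closed-form ridge-regression readout.
lam = 1e-6
W_out = np.linalg.solve(Xtr.T @ Xtr + lam * np.eye(Xtr.shape[1]), Xtr.T @ ytr)
pred = Xte @ W_out

print(f"RMSE: {rmse(yte, pred):.3f}   NSE: {nse(yte, pred):.3f}")
```

Stacking reservoirs (rather than enlarging a single one) is what distinguishes a Deep ESN from a standard ESN; only the linear readout is trained, which keeps the model fast to fit for multi-month lead times.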
Pages: 18
Related Papers
50 records in total
[31]   Deep Tree Echo State Networks [J].
Gallicchio, Claudio ;
Micheli, Alessio .
2018 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2018, :499-506
[32]   Modular state space of echo state network [J].
Ma, Qian-Li ;
Chen, Wei-Biao .
NEUROCOMPUTING, 2013, 122 :406-417
[33]   Multi-state delayed echo state network with empirical wavelet transform for time series prediction [J].
Yao, Xianshuang ;
Wang, Huiyu ;
Shao, Yanning ;
Huang, Zhanjun ;
Cao, Shengxian ;
Ma, Qingchuan .
APPLIED INTELLIGENCE, 2024, 54 (06) :4646-4667
[34]   Forecasting energy consumption and wind power generation using deep echo state network [J].
Hu, Huanling ;
Wang, Lin ;
Lv, Sheng-Xiang .
RENEWABLE ENERGY, 2020, 154 :598-613
[35]   Advanced Underwater Wireless Optical Communication System Assisted by Deep Echo State Network [J].
Wang, Kexin ;
Gao, Yihong ;
Dragone, Mauro ;
Petillot, Yvan ;
Wang, Xu .
2022 ASIA COMMUNICATIONS AND PHOTONICS CONFERENCE, ACP, 2022, :710-713
[36]   Multivariate Chaotic Time Series Prediction Using a Wavelet Diagonal Echo State Network [J].
Xu, Meiling ;
Han, Min ;
Wang, Jun .
2015 SECOND INTERNATIONAL CONFERENCE ON MATHEMATICS AND COMPUTERS IN SCIENCES AND IN INDUSTRY (MCSI), 2015, :86-92
[37]   Online sequential echo state network with sparse RLS algorithm for time series prediction [J].
Yang, Cuili ;
Qiao, Junfei ;
Ahmad, Zohaib ;
Nie, Kaizhe ;
Wang, Lei .
NEURAL NETWORKS, 2019, 118 :32-42
[38]   A Novel Deep Belief Network and Extreme Learning Machine Based Performance Degradation Prediction Method for Proton Exchange Membrane Fuel Cell [J].
Xie, Yucen ;
Zou, Jianxiao ;
Li, Zhongliang ;
Gao, Fei ;
Peng, Chao .
IEEE ACCESS, 2020, 8 :176661-176675
[39]   Rolling decomposition method in fusion with echo state network for wind speed forecasting [J].
Hu, Huanling ;
Wang, Lin ;
Zhang, Dabin ;
Ling, Liwen .
RENEWABLE ENERGY, 2023, 216
[40]   Genetic algorithm optimized double-reservoir echo state network for multi-regime time series prediction [J].
Zhong, Shisheng ;
Xie, Xiaolong ;
Lin, Lin ;
Wang, Fang .
NEUROCOMPUTING, 2017, 238 :191-204