Computational Efficiency of Multi-Step Learning Echo State Networks for Nonlinear Time Series Prediction

Cited by: 15
Authors
Akiyama, Takanori [1 ]
Tanaka, Gouhei [1 ,2 ,3 ]
Affiliations
[1] Univ Tokyo, Grad Sch Informat Sci & Technol, Dept Math Informat, Tokyo 1138656, Japan
[2] Univ Tokyo, Int Res Ctr Neurointelligence, Inst Adv Study, Tokyo 1130033, Japan
[3] Univ Tokyo, Grad Sch Engn, Dept Elect Engn & Informat Syst, Tokyo 1138656, Japan
Funding
Japan Society for the Promotion of Science; Japan Science and Technology Agency
Keywords
Reservoirs; Training; Time series analysis; Computational modeling; Standards; Task analysis; Predictive models; Reservoir computing; time series prediction; nonlinear dynamical systems; linear regression; computational cost
DOI
10.1109/ACCESS.2022.3158755
Chinese Library Classification: TP [Automation Technology; Computer Technology]
Discipline Classification Code: 0812
Abstract
The echo state network (ESN) is a representative model of reservoir computing and has been used mainly for temporal pattern recognition. Recent studies have shown that multi-reservoir ESN models, constructed from multiple reservoirs, can enhance the potential of the ESN-based approach. In the present study, we investigate the computational performance and efficiency of the multi-step learning ESN, a multi-reservoir ESN model characterized by a step-by-step training process. We show that the time complexity of the training algorithm of the multi-step learning ESN is equal to or smaller than that of the standard ESN. Our numerical experiments demonstrate that, on nonlinear time series prediction tasks, the multi-step learning ESN achieves better or comparable performance with much less computational time than the standard ESN. Moreover, we reveal how the model architecture of the multi-step learning ESN compares with other possible variant models. The step-by-step learning scheme is applicable to general multi-reservoir systems and hardware, enhancing their computational ability and efficiency.
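The complexity claim in the abstract follows from the cost of the ridge-regression readout: for T training steps and reservoir size N, forming and solving the normal equations costs O(T N^2 + N^3), so training several smaller readouts step by step can be cheaper than training one readout over a single large reservoir. The following is a minimal Python sketch of this idea; the two-reservoir layout, the residual-fitting second stage, and all hyperparameters are illustrative assumptions, not the paper's exact architecture.

# Illustrative sketch: ESN with a ridge-regression readout, trained
# in two steps. The serial residual-fitting scheme below is an
# assumption for demonstration, not the paper's exact algorithm.
import numpy as np

rng = np.random.default_rng(0)

def make_reservoir(n_in, n_res, spectral_radius=0.9, input_scale=0.5):
    # Random input and recurrent weights; recurrent matrix rescaled
    # to the target spectral radius for the echo state property.
    W_in = input_scale * rng.uniform(-1, 1, (n_res, n_in))
    W = rng.uniform(-1, 1, (n_res, n_res))
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
    return W_in, W

def run_reservoir(W_in, W, u, washout=100):
    # Drive the reservoir with input sequence u (T x n_in);
    # return the state trajectory after the washout period.
    T, n_res = u.shape[0], W.shape[0]
    x = np.zeros(n_res)
    X = np.empty((T, n_res))
    for t in range(T):
        x = np.tanh(W_in @ u[t] + W @ x)
        X[t] = x
    return X[washout:]

def ridge_readout(X, y, beta=1e-6):
    # Closed-form ridge regression: (X^T X + beta I) w = X^T y,
    # costing O(T N^2 + N^3) for N state variables.
    return np.linalg.solve(X.T @ X + beta * np.eye(X.shape[1]), X.T @ y)

# Toy task: one-step-ahead prediction of a nonlinear sine signal.
T, washout = 2000, 100
u = np.sin(0.2 * np.arange(T))[:, None] ** 3
y = u[1:, 0]  # target: next input value

# Step 1: train the first reservoir's readout on the target.
W_in1, W1 = make_reservoir(1, 200)
X1 = run_reservoir(W_in1, W1, u[:-1], washout)
w1 = ridge_readout(X1, y[washout:])
residual = y[washout:] - X1 @ w1

# Step 2: a second, smaller reservoir fits the remaining residual.
W_in2, W2 = make_reservoir(1, 100)
X2 = run_reservoir(W_in2, W2, u[:-1], washout)
w2 = ridge_readout(X2, residual)

pred = X1 @ w1 + X2 @ w2
print("NRMSE:", np.sqrt(np.mean((pred - y[washout:]) ** 2)) / np.std(y[washout:]))

Each step solves a ridge regression only over its own reservoir's states, so two solves of sizes 200 and 100 replace one solve of size 300, which is the source of the training-cost saving that the abstract describes.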
Pages: 28535-28544
Page count: 10