Computational Efficiency of Multi-Step Learning Echo State Networks for Nonlinear Time Series Prediction

Cited by: 15
Authors
Akiyama, Takanori [1]
Tanaka, Gouhei [1,2,3]
Affiliations
[1] Univ Tokyo, Grad Sch Informat Sci & Technol, Dept Math Informat, Tokyo 1138656, Japan
[2] Univ Tokyo, Int Res Ctr Neurointelligence, Inst Adv Study, Tokyo 1130033, Japan
[3] Univ Tokyo, Grad Sch Engn, Dept Elect Engn & Informat Syst, Tokyo 1138656, Japan
Funding
Japan Society for the Promotion of Science (JSPS); Japan Science and Technology Agency (JST)
Keywords
Reservoirs; Training; Time series analysis; Computational modeling; Standards; Task analysis; Predictive models; Reservoir computing; time series prediction; nonlinear dynamical systems; linear regression; computational cost
DOI
10.1109/ACCESS.2022.3158755
CLC Number
TP [Automation and Computer Technology]
Discipline Code
0812
Abstract
The echo state network (ESN) is a representative model of reservoir computing and has mainly been used for temporal pattern recognition. Recent studies have shown that multi-reservoir ESN models, constructed with multiple reservoirs, can enhance the potential of the ESN-based approach. In the present study, we investigate the computational performance and efficiency of the multi-step learning ESN, a multi-reservoir ESN model characterized by a step-by-step learning process. We show that the time complexity of the training algorithm of the multi-step learning ESN is equal to or smaller than that of the standard ESN. Our numerical experiments demonstrate that, in nonlinear time series prediction tasks, the multi-step learning ESN can achieve better or comparable performance with much less computational time than the standard ESN. Moreover, we reveal how the architecture of the multi-step learning ESN is effective in comparison with other possible variant models. Step-by-step learning is applicable to general multi-reservoir systems and hardware to enhance their computational ability and efficiency.
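
To make the training procedure concrete, the following is a minimal sketch of an ESN with a ridge-regression (Tikhonov-regularized) readout, followed by an illustrative step-by-step stage in which a second reservoir's readout is fitted to the residual of the first. It is a generic reconstruction based on the standard ESN formulation, not the authors' implementation; the function names and all hyperparameter values (reservoir sizes, spectral radius, input scaling, ridge strength, washout length) are illustrative assumptions.

# A minimal sketch of an ESN trained by ridge regression, plus an
# illustrative step-by-step second stage. Generic reconstruction under
# stated assumptions, not the authors' code; hyperparameters are arbitrary.
import numpy as np

rng = np.random.default_rng(0)

def make_reservoir(n_in, n_res, spectral_radius=0.9, input_scale=0.1):
    # Random input and recurrent weights; the recurrent matrix is rescaled
    # to the target spectral radius, a standard recipe for the echo state
    # property.
    W_in = input_scale * rng.uniform(-1.0, 1.0, size=(n_res, n_in))
    W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))
    W *= spectral_radius / max(abs(np.linalg.eigvals(W)))
    return W_in, W

def run_reservoir(W_in, W, inputs, washout=100):
    # Drive the reservoir with the input sequence and keep the states after
    # the initial transient ("washout") has died out.
    x = np.zeros(W.shape[0])
    states = []
    for t, u in enumerate(inputs):
        x = np.tanh(W_in @ np.atleast_1d(u) + W @ x)
        if t >= washout:
            states.append(x.copy())
    return np.array(states)

def train_readout(states, targets, ridge=1e-6):
    # Tikhonov-regularized linear regression,
    # W_out = (X^T X + beta I)^{-1} X^T y: the O(n_res^3) solve (plus
    # O(n_res^2 T) for the Gram matrix) that dominates training cost.
    n_res = states.shape[1]
    return np.linalg.solve(states.T @ states + ridge * np.eye(n_res),
                           states.T @ targets)

# Toy one-step-ahead prediction task.
T = 2000
u = np.sin(0.2 * np.arange(T))         # input series
y = np.roll(u, -1)                     # target: the next value
W_in1, W1 = make_reservoir(1, 200)
X1 = run_reservoir(W_in1, W1, u[:-1])
W_out1 = train_readout(X1, y[100:-1])  # targets aligned with post-washout states

# Step-by-step stage (illustrative; the exact coupling in the paper may
# differ): a second reservoir's readout is fitted to the residual of the
# first, so each step solves a separate, smaller regression problem.
residual = y[100:-1] - X1 @ W_out1
W_in2, W2 = make_reservoir(1, 100)
X2 = run_reservoir(W_in2, W2, u[:-1])
W_out2 = train_readout(X2, residual)
y_hat = X1 @ W_out1 + X2 @ W_out2      # combined prediction

Because each readout here is obtained by an independent regression whose solve scales cubically in its own reservoir size, distributing the same total number of nodes over several smaller readouts cannot increase, and typically decreases, the regression cost, which matches the abstract's claim that the training complexity of the multi-step learning ESN is equal to or smaller than that of the standard ESN.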
Pages: 28535-28544
Page count: 10