Efficient Implementations of Echo State Network Cross-Validation

Cited by: 18
Authors
Lukosevicius, Mantas [1 ]
Uselis, Arnas [1 ]
Affiliations
[1] Kaunas Univ Technol, Studentu St 50-406, LT-51368 Kaunas, Lithuania
Keywords
Echo state networks; Reservoir computing; Recurrent neural networks; Cross-validation; Time complexity;
DOI
10.1007/s12559-021-09849-2
CLC classification
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104; 0812; 0835; 1405;
Abstract
Cross-Validation (CV) is still uncommon in time series modeling. Echo State Networks (ESNs), as a prime example of Reservoir Computing (RC) models, are known for their fast and precise one-shot learning, which often benefits from careful hyper-parameter tuning. This makes them ideal candidates for changing the status quo. We discuss CV of time series for predicting a concrete time interval of interest, suggest several schemes for cross-validating ESNs, and introduce an efficient algorithm for implementing them. This algorithm is presented as two levels of optimization of k-fold CV. Training an RC model typically consists of two stages: (i) running the reservoir with the data and (ii) computing the optimal readouts. The first level of our optimization addresses the most computationally expensive part, (i), and makes its cost constant irrespective of k. It dramatically reduces reservoir computations in any type of RC system and is sufficient when k is small. The second level of optimization also makes the cost of part (ii) constant even for large k, as long as the dimension of the output is low. We discuss when the proposed validation schemes for ESNs can be beneficial, present three options for producing the final model, investigate them empirically on six different real-world datasets, and report empirical computation-time experiments. We provide the code in an online repository. The proposed CV schemes give better and more stable test performance on all six real-world datasets, spanning three task types. Empirical run times confirm our complexity analysis. In most situations, k-fold CV of ESNs and many other RC models can be done with virtually the same time and space complexity as a simple single-split validation. This enables CV to become a standard practice in RC.
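The first-level optimization described in the abstract can be sketched as follows: the reservoir is run once over the whole series (stage (i), cost independent of k), and per-fold ridge readouts (stage (ii)) are then assembled from precomputed per-fold covariance sums. This is a minimal illustrative sketch, not the authors' implementation; the reservoir size, scalings, toy task, and regularization value are all assumptions made here for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: scalar input series, target = input delayed by one step (hypothetical task).
T = 300
u = rng.standard_normal(T)
y = np.roll(u, 1)

# Small random reservoir (illustrative sizes and scalings, not the paper's settings).
N = 50
W_in = rng.uniform(-0.5, 0.5, N)
W = rng.standard_normal((N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # rescale to spectral radius 0.9

# Stage (i): run the reservoir ONCE over the whole series.
# This, the most expensive part, does not depend on the number of folds k.
x = np.zeros(N)
X = np.empty((T, N))
for t in range(T):
    x = np.tanh(W_in * u[t] + W @ x)
    X[t] = x

# Stage (ii): per-fold ridge readouts from precomputed per-fold sums.
k = 5
folds = np.array_split(np.arange(T), k)
G = [X[f].T @ X[f] for f in folds]   # per-fold X^T X contributions
P = [X[f].T @ y[f] for f in folds]   # per-fold X^T y contributions
G_tot, P_tot = sum(G), sum(P)

beta = 1e-6  # ridge regularization (illustrative value)
errors = []
for i, f in enumerate(folds):
    # Training matrices = totals minus the held-out fold's contribution,
    # so each fold's readout costs only one small linear solve.
    A = G_tot - G[i] + beta * np.eye(N)
    b = P_tot - P[i]
    w_out = np.linalg.solve(A, b)
    errors.append(np.mean((X[f] @ w_out - y[f]) ** 2))

print(f"mean validation MSE over {k} folds: {np.mean(errors):.4f}")
```

The subtraction trick in the loop is what keeps the per-fold cost independent of the series length: the O(T·N²) accumulation of `X^T X` happens once, and each fold only solves an N×N system.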
Pages: 1470 - 1484
Page count: 15