Discrete-Time Signatures and Randomness in Reservoir Computing

Cited by: 22
Authors
Cuchiero, Christa [1]
Gonon, Lukas [2]
Grigoryeva, Lyudmila [3]
Ortega, Juan-Pablo [4]
Teichmann, Josef [5]
Affiliations
[1] Univ Vienna, Dept Stat & Operat Res, A-1010 Vienna, Austria
[2] Ludwig Maximilians Univ Munchen, LMU Math Inst, D-80336 Munich, Germany
[3] Univ Konstanz, Dept Math & Stat, D-78464 Constance, Germany
[4] Nanyang Technol Univ, Singapore, Singapore
[5] Swiss Fed Inst Technol, CH-8092 Zurich, Switzerland
Funding
Swiss National Science Foundation
Keywords
Reservoirs; Synthetic aperture sonar; Mathematical model; Machine learning; Training; Recurrent neural networks; Differential equations; Echo state network (ESN); Johnson-Lindenstrauss (JL) lemma; machine learning; recurrent neural network (RNN); reservoir computing (RC); signature state-affine system (SigSAS); state-affine system (SAS); Volterra series; ECHO STATE NETWORKS; FADING-MEMORY; JOHNSON-LINDENSTRAUSS; SYSTEMS;
DOI
10.1109/TNNLS.2021.3076777
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
A new explanation of the geometric nature of the reservoir computing (RC) phenomenon is presented. RC is understood in the literature as the possibility of approximating input-output systems with randomly chosen recurrent neural systems and a trained linear readout layer. Light is shed on this phenomenon by constructing what are called strongly universal reservoir systems as random projections of a family of state-space systems that generate Volterra series expansions. This procedure yields a state-affine reservoir system with randomly generated coefficients whose dimension is logarithmically reduced with respect to that of the original system. This reservoir system can approximate any element of the class of fading-memory filters simply by training a different linear readout for each filter. Explicit expressions for the probability distributions needed to generate the projected reservoir system are stated, and bounds on the resulting approximation error are provided.
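To illustrate the RC principle described in the abstract, the following is a minimal, self-contained Python sketch of a generic state-affine reservoir with randomly generated coefficients and a trained linear readout. It is not the paper's SigSAS construction nor its Johnson-Lindenstrauss projection; the dimensions, the toy input and target, the contraction rescaling, and the ridge-regression readout are all illustrative assumptions chosen only to show the "random reservoir + trained readout" workflow.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes (not taken from the paper).
d = 100        # reservoir (state) dimension
T = 2000       # number of time steps
washout = 200  # initial transient discarded before fitting the readout

# Scalar input path z_t in [-1, 1]; plain white noise for illustration.
z = rng.uniform(-1.0, 1.0, size=T)

# Randomly generated state-affine system x_t = A(z_t) x_{t-1} + b(z_t),
# with A(z) = A0 + z*A1 and b(z) = b0 + z*b1.  The matrices are rescaled so
# that ||A0|| + ||A1|| < 1, a simple sufficient condition for a contraction
# uniformly in z (and hence for the echo state / fading-memory property).
A0 = rng.standard_normal((d, d))
A1 = rng.standard_normal((d, d))
scale = 0.9 / (np.linalg.norm(A0, 2) + np.linalg.norm(A1, 2))
A0, A1 = scale * A0, scale * A1
b0 = rng.standard_normal(d)
b1 = rng.standard_normal(d)

# Run the reservoir; its coefficients stay fixed, only the readout is trained.
X = np.zeros((T, d))
x = np.zeros(d)
for t in range(T):
    x = (A0 + z[t] * A1) @ x + (b0 + z[t] * b1)
    X[t] = x

# Toy fading-memory target: y_t = z_{t-2} * z_{t-1}.
y = np.zeros(T)
y[2:] = z[:-2] * z[1:-1]

# Ridge-regression readout on the post-washout states.
lam = 1e-6
Xw, yw = X[washout:], y[washout:]
W_out = np.linalg.solve(Xw.T @ Xw + lam * np.eye(d), Xw.T @ yw)

y_hat = X @ W_out
mse = np.mean((y_hat[washout:] - yw) ** 2)
print(f"readout MSE: {mse:.3e}")
```

Approximating a different fading-memory target with the same frozen random reservoir only requires refitting `W_out`, which is the sense in which a single randomly generated system serves many filters.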
Pages: 6321-6330
Number of pages: 10