Echo State Gaussian Process

Cited by: 91
Authors
Chatzis, Sotirios P. [1 ]
Demiris, Yiannis [1 ]
Affiliations
[1] Univ London Imperial Coll Sci Technol & Med, Dept Elect & Elect Engn, London SW7 2BT, England
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 2011, Vol. 22, No. 9
Keywords
Bayesian inference; Gaussian process; reservoir computing; sequential data modeling;
DOI
10.1109/TNN.2011.2162109
Chinese Library Classification
TP18 (Artificial intelligence theory)
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Echo state networks (ESNs) constitute a novel approach to recurrent neural network (RNN) training, with an RNN (the reservoir) being generated randomly, and only a readout being trained using a simple computationally efficient algorithm. ESNs have greatly facilitated the practical application of RNNs, outperforming classical approaches on a number of benchmark tasks. In this paper, we introduce a novel Bayesian approach toward ESNs, the echo state Gaussian process (ESGP). The ESGP combines the merits of ESNs and Gaussian processes to provide a more robust alternative to conventional reservoir computing networks while also offering a measure of confidence on the generated predictions (in the form of a predictive distribution). We exhibit the merits of our approach in a number of applications, considering both benchmark datasets and real-world applications, where we show that our method offers a significant enhancement in the dynamical data modeling capabilities of ESNs. Additionally, we also show that our method is orders of magnitude more computationally efficient compared to existing Gaussian process-based methods for dynamical data modeling, without compromises in the obtained predictive performance.
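The abstract's central construction (a randomly generated recurrent reservoir whose states are fed to a Gaussian process readout, yielding a predictive mean and variance) can be sketched as below. This is a minimal illustration, not the paper's implementation: the reservoir size, spectral radius, RBF length scale, noise level, and the toy one-step-ahead sine prediction task are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Randomly generated reservoir (the ESN part; weights are never trained) ---
n_in, n_res = 1, 50
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # spectral radius < 1 (echo state property heuristic)

def reservoir_states(u):
    """Drive the reservoir with input sequence u and collect its states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ np.atleast_1d(u_t) + W @ x)
        states.append(x.copy())
    return np.array(states)

# --- Exact GP regression readout on reservoir states (the ESGP idea) ---
def rbf(A, B, ell=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell**2)

# Toy task (assumed for illustration): one-step-ahead prediction of a sine wave
t = np.linspace(0, 8 * np.pi, 400)
u = np.sin(t)
X = reservoir_states(u[:-1])
y = u[1:]
X_tr, y_tr, X_te, y_te = X[:300], y[:300], X[300:], y[300:]

noise = 1e-2
K = rbf(X_tr, X_tr) + noise * np.eye(len(X_tr))
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_tr))

K_s = rbf(X_te, X_tr)
mean = K_s @ alpha                                  # predictive mean
v = np.linalg.solve(L, K_s.T)
var = rbf(X_te, X_te).diagonal() - (v**2).sum(0)    # predictive variance

print("test RMSE:", np.sqrt(np.mean((mean - y_te) ** 2)))
```

The predictive variance is what distinguishes this readout from the usual linear least-squares readout of a standard ESN: each prediction comes with a confidence measure, as the abstract describes.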
Pages: 1435-1445
Page count: 11