Recognizing recurrent neural networks (rRNN): Bayesian inference for recurrent neural networks

Cited by: 0
Authors
Sebastian Bitzer
Stefan J. Kiebel
Affiliations
[1] MPI for Human Cognitive and Brain Sciences
Source
Biological Cybernetics | 2012 / Volume 106
Keywords
Recurrent neural networks; Bayesian inference; Nonlinear dynamics; Human motion
DOI
Not available
Abstract
Recurrent neural networks (RNNs) are widely used in computational neuroscience and machine learning applications. In an RNN, each neuron computes its output as a nonlinear function of its integrated input. While the importance of RNNs, especially as models of brain processing, is undisputed, it is also widely acknowledged that the computations in standard RNN models may be an over-simplification of what real neuronal networks compute. Here, we suggest that the RNN approach may be made computationally more powerful by its fusion with Bayesian inference techniques for nonlinear dynamical systems. In this scheme, we use an RNN as a generative model of dynamic input caused by the environment, e.g. of speech or kinematics. Given this generative RNN model, we derive Bayesian update equations that can decode its output. Critically, these updates define a ‘recognizing RNN’ (rRNN), in which neurons compute and exchange prediction and prediction error messages. The rRNN has several desirable features that a conventional RNN does not have, e.g. fast decoding of dynamic stimuli and robustness to initial conditions and noise. Furthermore, it implements a predictive coding scheme for dynamic inputs. We suggest that the Bayesian inversion of RNNs may be useful both as a model of brain function and as a machine learning tool. We illustrate the use of the rRNN by an application to the online decoding (i.e. recognition) of human kinematics.
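To make the scheme concrete, the following is a minimal numerical sketch of one way such a Bayesian inversion can work. It is an illustration under stated assumptions, not the paper's actual derivation: the generative model is taken to be a standard leaky rate RNN, and it is inverted online with an extended Kalman filter, whose update decomposes into exactly the two message types named above, a prediction and a precision-weighted prediction error. The names (f, jac_f, rrnn_step) and all noise settings are hypothetical.

import numpy as np

# Illustrative sketch (assumed model, not the paper's exact equations).
# Generative RNN:  x_t = f(x_{t-1}) = x_{t-1} + dt * (-x_{t-1} + W @ tanh(x_{t-1}))
# Observation:     y_t = H @ x_t + noise
# Inversion: an extended Kalman filter, one standard Bayesian scheme for
# nonlinear dynamics. Each update exchanges a prediction message and a
# precision-weighted prediction-error message, as in predictive coding.

rng = np.random.default_rng(0)
n, m, dt = 8, 3, 0.1
W = rng.normal(scale=1.0 / np.sqrt(n), size=(n, n))  # recurrent weights
H = rng.normal(size=(m, n))                          # observation matrix
Q = 1e-3 * np.eye(n)                                 # process noise covariance
R = 1e-2 * np.eye(m)                                 # observation noise covariance

def f(x):
    # One step of the generative RNN (leaky rate dynamics).
    return x + dt * (-x + W @ np.tanh(x))

def jac_f(x):
    # Jacobian of f, used to linearize the dynamics around the current estimate.
    D = np.diag(1.0 - np.tanh(x) ** 2)
    return np.eye(n) + dt * (-np.eye(n) + W @ D)

def rrnn_step(mu, P, y):
    # One recognition update: predict with the RNN, correct with the error.
    mu_pred = f(mu)                                  # prediction message
    F = jac_f(mu)
    P_pred = F @ P @ F.T + Q
    err = y - H @ mu_pred                            # prediction-error message
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)              # precision weighting (Kalman gain)
    return mu_pred + K @ err, (np.eye(n) - K @ H) @ P_pred

# Simulate a hidden trajectory and decode it online from noisy observations.
x = rng.normal(size=n)
mu, P = np.zeros(n), np.eye(n)
for t in range(200):
    x = f(x) + rng.multivariate_normal(np.zeros(n), Q)
    y = H @ x + rng.multivariate_normal(np.zeros(m), R)
    mu, P = rrnn_step(mu, P, y)
print("final decoding error:", np.linalg.norm(x - mu))

Because the filter state carries its own uncertainty, the decoder converges from an arbitrary initial guess (mu starts at zero) and tolerates observation noise, which is the kind of robustness to initial conditions and noise the abstract attributes to the rRNN.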
Pages: 201 / 217
Page count: 16