THE SUPER-TURING COMPUTATIONAL POWER OF PLASTIC RECURRENT NEURAL NETWORKS

Cited by: 48
Authors
Cabessa, Jeremie [1 ]
Siegelmann, Hava T. [2 ]
Affiliations
[1] Univ Paris 02, Lab Math Econ LEMMA, F-75006 Paris, France
[2] Univ Massachusetts, Dept Comp Sci, Biologically Inspired Neural & Dynam Syst Lab, Amherst, MA 01003 USA
Funding
National Science Foundation (USA);
Keywords
Neural networks; plastic neural networks; neural computation; Turing machines; Turing machines with advice; super-Turing; plasticity; evolvability; adaptability; learning; computational capabilities; SYNAPTIC PLASTICITY; TIME; TERMS; NOISE;
DOI
10.1142/S0129065714500294
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We study the computational capabilities of a biologically inspired neural model in which the synaptic weights, the connectivity pattern, and the number of neurons can evolve over time rather than remain static. Our study focuses on the concept of plasticity itself, so the nature of the updates is left unconstrained. In this context, we show that these plastic recurrent neural networks (RNNs) achieve exactly the same super-Turing computational power as static analog neural networks, irrespective of whether their synaptic weights are modeled by rational or real numbers, and irrespective of whether their plasticity is restricted to bi-valued updates or allowed to take any more general form. Consequently, incorporating only bi-valued plastic capabilities into a basic RNN model already suffices to break the Turing barrier and reach the super-Turing level of computation. Allowing more general mechanisms of architectural plasticity, or real synaptic weights, does not further increase the capabilities of the networks. These results support the claim that the general mechanism of plasticity is crucially involved in the computational and dynamical capabilities of biological neural networks. They further show that the super-Turing level of computation suitably reflects the capabilities of brain-like models of computation.
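To make the kind of model described in the abstract concrete, here is a minimal sketch of a recurrent network with a saturated-linear activation whose weight matrix may switch between two rational-valued matrices at each time step, which is one simple reading of "bi-valued" plasticity. The function names, the alternating schedule, and the example matrices are illustrative assumptions, not the paper's actual construction.

```python
import numpy as np

def sigma(x):
    # Saturated-linear activation commonly used in analog-RNN computability results:
    # identity on [0, 1], clipped to 0 below and to 1 above.
    return np.clip(x, 0.0, 1.0)

def run_plastic_rnn(W0, W1, schedule, inputs):
    """Simulate a plastic RNN whose weight matrix switches between the two
    rational-valued matrices W0 and W1 (a simple bi-valued plasticity pattern).

    schedule : function t -> 0 or 1, selecting the active matrix at step t
    inputs   : sequence of external input vectors, one per time step
    """
    x = np.zeros(W0.shape[0])               # initial state: all neurons at 0
    for t, u in enumerate(inputs):
        W = W0 if schedule(t) == 0 else W1  # plastic (bi-valued) weight update
        x = sigma(W @ x + u)                # standard recurrent state update
    return x

# Example usage: 3 neurons, alternating between the two weight matrices each step.
W0 = np.array([[0.5, 0.0, 0.0], [0.25, 0.5, 0.0], [0.0, 0.0, 1.0]])
W1 = np.array([[0.0, 0.5, 0.0], [0.5, 0.0, 0.25], [0.0, 0.5, 0.5]])
inputs = [np.array([1.0, 0.0, 0.0])] + [np.zeros(3)] * 9
final_state = run_plastic_rnn(W0, W1, schedule=lambda t: t % 2, inputs=inputs)
print(final_state)
```

The paper's result concerns what such evolving-weight networks can compute in principle (the level of Turing machines with advice); the sketch above only illustrates the dynamics, not the computability argument.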
Pages: 22