Sensitive Finite-State Computations Using a Distributed Network With a Noisy Network Attractor

Cited by: 8
Authors
Ashwin, Peter [1 ,2 ]
Postlethwaite, Claire [3 ]
Affiliations
[1] Univ Exeter, Dept Math, Ctr Syst Dynam & Control, Exeter EX4 4QF, Devon, England
[2] Univ Exeter, EPSRC Ctr Predict Modelling Healthcare, Exeter EX4 4QJ, Devon, England
[3] Univ Auckland, Dept Math, Auckland 1142, New Zealand
Funding
Engineering and Physical Sciences Research Council (EPSRC), UK;
Keywords
Artificial neural networks; nonlinear dynamical systems; Turing machines (TMs); heteroclinic networks; dynamics;
DOI
10.1109/TNNLS.2018.2813404
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We exhibit a class of smooth continuous-state neural-inspired networks composed of simple nonlinear elements that can be made to function as a finite-state computational machine. We give an explicit construction of arbitrary finite-state virtual machines in the spatiotemporal dynamics of the network. The dynamics of the functional network can be completely characterized as a "noisy network attractor" in phase space operating in either an "excitable" or a "free-running" regime, corresponding, respectively, to excitable or heteroclinic connections between states. The regime depends on the sign of an "excitability parameter." Viewing the network as a nonlinear stochastic differential equation where a deterministic (signal) and/or a stochastic (noise) input is applied to any element, we explore the influence of the signal-to-noise ratio on the error rate of the computations. The free-running regime is extremely sensitive to inputs: arbitrarily small amplitude perturbations can be used to perform computations with the system as long as the input dominates the noise. We find a counter-intuitive regime where increasing noise amplitude can lead to more, rather than less, accurate computation. We suggest that noisy network attractors will be useful for understanding neural networks that reliably and sensitively perform finite-state computations in a noisy environment.
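The abstract's central picture, in which a small input signal and additive noise compete to drive transitions between the discrete states of the machine, can be illustrated with a minimal sketch. The following Python code is not the authors' construction: it is a toy one-dimensional double-well stochastic differential equation integrated with the Euler-Maruyama scheme, where the two wells stand in for two machine states and the parameter names (signal, sigma) are illustrative assumptions.

import numpy as np

# Illustrative sketch only (not the paper's network): a double-well SDE
#   dx = (x - x^3 + signal) dt + sigma dW
# whose wells at x = -1 and x = +1 play the role of two machine states.
# A small constant input `signal` biases which transitions occur, while
# noise of amplitude `sigma` can either mask the input or help the
# trajectory escape a well.

def simulate(signal=0.05, sigma=0.2, dt=1e-3, T=200.0, x0=-1.0, seed=0):
    """Integrate the toy SDE with Euler-Maruyama and count well switches."""
    rng = np.random.default_rng(seed)
    n_steps = int(T / dt)
    x = x0
    state = int(x > 0)            # current "symbol": 0 = left well, 1 = right well
    switches = 0
    for _ in range(n_steps):
        drift = x - x**3 + signal
        x += drift * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        new_state = int(x > 0)
        if new_state != state:    # record a transition between the two states
            switches += 1
            state = new_state
    return switches

if __name__ == "__main__":
    # Sweep the noise amplitude at fixed input to see how the
    # transition statistics change with the signal-to-noise ratio.
    for sigma in (0.1, 0.3, 0.5):
        print(f"sigma={sigma:.1f}: switches={simulate(sigma=sigma)}")

Sweeping sigma at fixed signal in this toy model gives only a rough feel for the interplay of input and noise described in the abstract; it does not reproduce the paper's noisy network attractor or its error-rate results.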
Pages: 5847-5858
Page count: 12