Learning Temporal Intervals in Neural Dynamics

Cited by: 5
Authors
Duran, Boris [1 ]
Sandamirskaya, Yulia [2 ,3 ]
Affiliations
[1] Univ Skovde, Informat Res Ctr, S-54101 Skovde, Sweden
[2] Univ Zurich, Inst Neuroinformat, CH-8057 Zurich, Switzerland
[3] Univ Zurich, Neurosci Ctr, CH-8057 Zurich, Switzerland
Keywords
Dynamic neural fields (DNFs); learning timing; memory for duration; neural dynamics; neuromorphic engineering; neurorobotics; DISTAL REWARD PROBLEM; FIELD-THEORY; WORKING-MEMORY; TIME; REPRESENTATION; INFORMATION; CEREBELLUM; MECHANISMS; NETWORKS; MODEL;
DOI
10.1109/TCDS.2017.2676839
Chinese Library Classification (CLC)
TP18 [Theory of artificial intelligence];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Storing and reproducing temporal intervals is an important component of perception, action generation, and learning. How temporal intervals can be represented in neuronal networks is thus an important research question both in the study of biological organisms and in artificial neuromorphic systems. Here, we introduce a neural-dynamic computing architecture for learning the temporal durations of actions. The architecture uses a dynamic neural field (DNF) representation of elapsed time and memory trace dynamics to store the experienced action duration. Interconnected dynamical nodes signal the beginning of an action, its successful accomplishment, or its failure, and activate the formation of a memory trace corresponding to the action's duration. The accumulated memory trace influences the competition between the dynamical nodes in such a way that the failure node gains a competitive advantage earlier if the stored duration is shorter. The model uses neurally grounded DNF dynamics and is a process model of how temporal durations may be stored in neural systems, both biological and artificial. The focus of this paper is the mechanism for storing and using durations in artificial neuronal systems. The model is validated in closed-loop experiments with a simulated robot.
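To make the mechanism concrete, below is a minimal sketch in Python of an Amari-type DNF over an elapsed-time axis combined with a memory trace, in the spirit of the architecture described in the abstract. The drifting input, all parameter values, and the threshold readout are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch, not the authors' implementation: a 1-D Amari-type dynamic
# neural field (DNF) over an "elapsed time" axis, driven by an input bump that
# drifts while an action runs, plus a memory trace that accumulates wherever
# the field is active. All parameter values and the drifting-input scheme are
# illustrative assumptions, not taken from the paper.
import numpy as np

N = 100                       # field sites along the elapsed-time axis
x = np.arange(N, dtype=float)
h = -2.0                      # resting level
tau_u, tau_m = 10.0, 50.0     # field / memory-trace time constants (assumed)
beta = 4.0                    # sigmoid steepness

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-beta * u))

# Lateral interaction: local excitation with weak global inhibition (assumed).
d = x[:, None] - x[None, :]
W = 1.0 * np.exp(-d**2 / (2 * 3.0**2)) - 0.5

u = np.full(N, h)             # field activation
m = np.zeros(N)               # memory trace

def step(u, m, inp, dt=1.0):
    f = sigmoid(u)
    du = (-u + h + inp + W @ f) / tau_u
    dm = f * (f - m) / tau_m  # trace builds only where the field is active
    return u + dt * du, m + dt * dm

# One action execution: the input peak drifts along the axis like a clock hand.
duration = 250                # action lasts 250 time steps (assumed)
for t in range(duration):
    center = 0.3 * t          # assumed drift rate of the elapsed-time bump
    inp = 5.0 * np.exp(-(x - center)**2 / (2 * 2.0**2))
    u, m = step(u, m, inp)

# The extent of the trace along the axis encodes the experienced duration.
print("memory trace extends up to x =", x[m > 0.1].max())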
Pages: 359-372
Number of pages: 14