Latent attractors: A general paradigm for context-dependent neural computation

Cited: 0
Authors
Doboli, Simona [1 ]
Minai, Ali A. [2 ]
Affiliations
[1] Hofstra Univ, Dept Comp Sci, Hempstead, NY 11549 USA
[2] Univ Cincinnati, Dept Elect & Comp Engn & Comp Sci, Cincinnati, OH 45221 USA
Source
TRENDS IN NEURAL COMPUTATION | 2007, Vol. 35
Funding
US National Science Foundation
Keywords
attractor networks; recurrent networks; context; sequence learning; modular networks; multi-scale dynamics;
DOI
Not available
CLC Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Context is an essential part of all cognitive function. However, neural network models have only considered this issue in limited ways, focusing primarily on the conditioning of a system's response by its recent history. This type of context, which we term Type I, is clearly relevant in many situations; but in other cases, the system's response for an extended period must be conditioned by stimuli encountered at a specific earlier time. For example, the decision to turn left or right at an intersection point in a navigation task depends on the goal set at the beginning of the task. We term this type of context, which sets the "frame of reference" for an entire episode, Type II context. The prefrontal cortex in mammals has been hypothesized to perform this function, but it has been difficult to incorporate this into neural network models. In the present chapter, we describe an approach called latent attractors that allows self-organizing neural systems to simultaneously incorporate both Type I and Type II context dependency. We demonstrate this by applying the approach to a series of problems requiring one or both types of context. We also argue that the latent attractor approach is a general and flexible method for incorporating multi-scale temporal dependence into neural systems, and possibly other self-organized network models.
Pages: 135+
Page count: 9
Related Papers
50 entries
  • [1] Coeffects: A Calculus of Context-Dependent Computation
    Petricek, Tomas
    Orchard, Dominic
    Mycroft, Alan
    ACM SIGPLAN NOTICES, 2014, 49 (09) : 123 - 135
  • [2] Coeffects: A Calculus of Context-Dependent Computation
    Petricek, Tomas
    Orchard, Dominic
    Mycroft, Alan
    ICFP'14: PROCEEDINGS OF THE 2014 ACM SIGPLAN INTERNATIONAL CONFERENCE ON FUNCTIONAL PROGRAMMING, 2014, : 123 - 135
  • [3] Context-Dependent Latent Inhibition in Preweanling Rats
    Revillo, D. A.
    Gaztanaga, M.
    Aranda, E.
    Paglini, M. G.
    Chotro, M. G.
    Arias, C.
    DEVELOPMENTAL PSYCHOBIOLOGY, 2014, 56 (07) : 1507 - 1517
  • [4] A Neural Model for Context-dependent Sequence Learning
    Berthouze, Luc
    Tijsseling, Adriaan
    NEURAL PROCESSING LETTERS, 2006, 23 : 27 - 45
  • [5] A neural model for context-dependent sequence learning
    Berthouze, L
    Tijsseling, A
    NEURAL PROCESSING LETTERS, 2006, 23 (01) : 27 - 45
  • [6] Context-Dependent Spatial Representations in the Hippocampus using Place Cell Dendritic Computation
    Alabi, Adedapo
    Vanderelst, Dieter
    Minai, Ali A.
    2022 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2022,
  • [7] Context-dependent arm pointing adaptation
    Seidler, RD
    Bloomberg, JJ
    Stelmach, GE
    BEHAVIOURAL BRAIN RESEARCH, 2001, 119 (02) : 155 - 166
  • [8] Human and monkey P3-like responses in a mixed modality paradigm: effects of context and context-dependent noradrenergic influences
    Pineda, JA
    Westerfield, M
    Kronenberg, BM
    Kubrin, J
    INTERNATIONAL JOURNAL OF PSYCHOPHYSIOLOGY, 1997, 27 (03) : 223 - 240
  • [9] A Framework for Context-dependent User Interface Adaptation
    Koelker, Stephan
    Schwinger, Felix
    Krempels, Karl-Heinz
    WEBIST: PROCEEDINGS OF THE 15TH INTERNATIONAL CONFERENCE ON WEB INFORMATION SYSTEMS AND TECHNOLOGIES, 2019, : 418 - 425
  • [10] Context-dependent savings in procedural category learning
    Crossley, Matthew J.
    Ashby, F. Gregory
    Maddox, W. Todd
    BRAIN AND COGNITION, 2014, 92 : 1 - 10