Conditioning and time representation in long short-term memory networks

Cited by: 9
Authors
Rivest, Francois [1 ,2 ]
Kalaska, John F. [3 ]
Bengio, Yoshua [4 ]
Affiliations
[1] Royal Mil Coll Canada, Dept Math & Comp Sci, Stn Forces, Kingston, ON K7K 7B4, Canada
[2] Queens Univ, Ctr Neurosci Studies, Kingston, ON, Canada
[3] Univ Montreal, Dept Physiol, Montreal, PQ H3C 3J7, Canada
[4] Univ Montreal, Dept Comp Sci & Operat Res, Montreal, PQ, Canada
Keywords
Time representation learning; Temporal-difference learning; Long short-term memory networks; Dopamine; Conditioning; Reinforcement learning; PARAMETRIC WORKING-MEMORY; MONKEY DOPAMINE NEURONS; REWARD-PREDICTION; PREMOTOR CORTEX; MODEL; RESPONSES; HIPPOCAMPUS; INTERVALS; DYNAMICS; STIMULUS;
DOI
10.1007/s00422-013-0575-1
Chinese Library Classification: TP3 (computing technology; computer technology)
Discipline code: 0812
Abstract
Dopaminergic models based on the temporal-difference learning algorithm usually do not differentiate trace from delay conditioning; instead, they use a fixed temporal representation of the time elapsed since conditioned stimulus onset. Recently, a new model was proposed in which timing is learned within a long short-term memory (LSTM) artificial neural network representing the cerebral cortex (Rivest et al., J Comput Neurosci 28(1):107-130, 2010). This paper evaluates that model's ability to reproduce and explain relevant data, as well as to make interesting new predictions. The model reveals a strikingly different temporal representation between trace and delay conditioning, since trace conditioning requires working memory to remember the past conditioned stimulus while delay conditioning does not. On the other hand, the model predicts no important difference in dopamine (DA) responses between the two conditions when trained on one conditioning paradigm and tested on the other. The model predicts that in trace conditioning, animal timing starts at conditioned stimulus offset rather than onset. In classical conditioning, it predicts that if the conditioned stimulus does not disappear after the reward, the animal may expect a second reward. Finally, the last simulation reveals that the buildup of activity of some units in the network can adapt to new delays by adjusting their rate of integration. Most importantly, the paper shows that, with the proposed architecture, it is possible to acquire discharge patterns similar to those observed in dopaminergic neurons and in the cerebral cortex on these tasks simply by minimizing a predictive cost function.
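To make the core training idea concrete, the sketch below shows a generic LSTM trained to minimize a predictive (next-step reward prediction) cost on a toy trace-conditioning trial. This is an illustrative assumption-laden example in PyTorch, not the authors' implementation: the trial structure, delays, network size, and hyperparameters are all arbitrary choices for demonstration.

```python
# Minimal sketch (assumed setup, not the paper's code): an LSTM receives a
# conditioned-stimulus (CS) input and is trained to predict the reward signal
# one step ahead, i.e., to minimize a predictive cost over the trial.
import torch
import torch.nn as nn

torch.manual_seed(0)

T = 30                   # time steps per trial (assumed)
CS_ON, CS_OFF = 5, 10    # trace conditioning: CS ends before the reward (assumed)
REWARD_T = 20            # reward delivery time (assumed)

def make_trial():
    """One trace-conditioning trial: a CS pulse followed later by a reward pulse."""
    cs = torch.zeros(T, 1)
    cs[CS_ON:CS_OFF] = 1.0
    reward = torch.zeros(T, 1)
    reward[REWARD_T] = 1.0
    return cs, reward

class Predictor(nn.Module):
    """LSTM that outputs, at every step, a prediction of the next reward value."""
    def __init__(self, hidden=16):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.readout = nn.Linear(hidden, 1)

    def forward(self, x):
        h, _ = self.lstm(x)
        return self.readout(h)

model = Predictor()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()

cs, reward = make_trial()
for step in range(500):
    pred = model(cs.unsqueeze(0))        # shape (1, T, 1)
    # Predictive cost: the output at time t is compared to the reward at t+1.
    loss = loss_fn(pred[0, :-1], reward[1:])
    opt.zero_grad()
    loss.backward()
    opt.step()

# With enough training, the prediction just before the reward time should grow
# toward 1, i.e., the recurrent state has learned to time the CS-reward interval.
print(model(cs.unsqueeze(0))[0, REWARD_T - 1].item())
```

Because the CS here ends before the reward, the LSTM must carry the stimulus in its recurrent state to time the interval, which is the working-memory demand that the abstract contrasts with delay conditioning (where the CS stays on until the reward).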
Pages: 23-48 (26 pages)
Related papers (50 total)
  • [1] Conditioning and time representation in long short-term memory networks
    Francois Rivest
    John F. Kalaska
    Yoshua Bengio
    Biological Cybernetics, 2014, 108: 23-48
  • [2] Landslide displacement prediction based on time series and long short-term memory networks
    Jin, Anjie
    Yang, Shasha
    Huang, Xuri
    BULLETIN OF ENGINEERING GEOLOGY AND THE ENVIRONMENT, 2024, 83 (07)
  • [3] Using long short-term memory networks to predict energy consumption of air-conditioning systems
    Zhou, Chonggang
    Fang, Zhaosong
    Xu, Xiaoning
    Zhang, Xuelin
    Ding, Yunfei
    Jiang, Xiangyang
    Ji, Ying
    SUSTAINABLE CITIES AND SOCIETY, 2020, 55
  • [4] Reliability Estimation Using Long Short-Term Memory Networks
    Davila-Frias, Alex
    Khumprom, Phattara
    Yadav, Om Prakash
    2023 ANNUAL RELIABILITY AND MAINTAINABILITY SYMPOSIUM, RAMS, 2023,
  • [5] A New Delay Connection for Long Short-Term Memory Networks
    Wang, Jianyong
    Zhang, Lei
    Chen, Yuanyuan
    Yi, Zhang
    INTERNATIONAL JOURNAL OF NEURAL SYSTEMS, 2018, 28 (06)
  • [6] Matching Biomedical Ontologies with Long Short-Term Memory Networks
    Jiang, Chao
    Xue, Xingsi
    2020 IEEE INTERNATIONAL CONFERENCE ON BIOINFORMATICS AND BIOMEDICINE, 2020, : 2484 - 2489
  • [7] Towards real-time respiratory motion prediction based on long short-term memory neural networks
    Lin, Hui
    Shi, Chengyu
    Wang, Brian
    Chan, Maria F.
    Tang, Xiaoli
    Ji, Wei
    PHYSICS IN MEDICINE AND BIOLOGY, 2019, 64 (08)
  • [8] Deepfake Detection using Capsule Networks and Long Short-Term Memory Networks
    Mehra, Akul
    Spreeuwers, Luuk
    Strisciuglio, Nicola
    VISAPP: PROCEEDINGS OF THE 16TH INTERNATIONAL JOINT CONFERENCE ON COMPUTER VISION, IMAGING AND COMPUTER GRAPHICS THEORY AND APPLICATIONS - VOL. 4: VISAPP, 2021, : 407 - 414
  • [9] Long-term and short-term memory networks based on forgetting memristors
    Liu, Yi
    Chen, Ling
    Li, Chuandong
    Liu, Xin
    Zhou, Wenhao
    Li, Ke
    SOFT COMPUTING, 2023, 27 (23) : 18403 - 18418
  • [10] Performance Analysis of Long Short-Term Memory Predictive Neural Networks on Time Series Data
    Bolboaca, Roland
    Haller, Piroska
    MATHEMATICS, 2023, 11 (06)