Reservoir stack machines

Cited: 0
Authors
Paassen, Benjamin [1 ]
Schulz, Alexander [2 ]
Hammer, Barbara [2 ]
Affiliations
[1] Humboldt Univ, Rudower Chaussee 25, D-12489 Berlin, Germany
[2] Bielefeld Univ, Inspiration 1, D-33619 Bielefeld, Germany
Keywords
Reservoir computing; Randomized machine learning; Context-free languages; LR automata; Memory-augmented neural networks; FINITE AUTOMATA; NETWORKS
DOI
10.1016/j.neucom.2021.05.106
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Memory-augmented neural networks equip a recurrent neural network with an explicit memory to support tasks that require information storage without interference over long time spans. A key motivation for such research is to perform classic computation tasks, such as parsing. However, memory-augmented neural networks are notoriously hard to train, requiring many backpropagation epochs and a lot of data. In this paper, we introduce the reservoir stack machine, a model that provably recognizes all deterministic context-free languages and circumvents the training problem by training only the output layer of a recurrent net and by employing auxiliary information during training about the desired interaction with a stack. In our experiments, we validate the reservoir stack machine against deep and shallow networks from the literature on three benchmark tasks for Neural Turing machines and on six deterministic context-free languages. Our results show that the reservoir stack machine achieves zero error, even on test sequences longer than the training data, requiring only a few seconds of training time and 100 training sequences. (c) 2021 Elsevier B.V. All rights reserved.
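The training shortcut the abstract describes — keeping the recurrent weights fixed and fitting only a linear output layer — can be sketched in the style of a minimal echo state network. This is an illustrative assumption-based sketch, not the authors' implementation: the reservoir sizes, the toy action encoding (shift/reduce), and all function names here are invented for the example; the paper's actual model additionally couples the reservoir to a stack.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res, n_out = 3, 50, 3   # one-hot input symbols, reservoir units, stack actions

# Fixed random recurrent net: these weights are never trained.
W_in = rng.uniform(-1, 1, (n_res, n_in))
W_res = rng.uniform(-1, 1, (n_res, n_res))
W_res *= 0.9 / max(abs(np.linalg.eigvals(W_res)))  # spectral radius below 1

def reservoir_states(seq):
    """Run the fixed recurrent net over a one-hot input sequence."""
    h = np.zeros(n_res)
    states = []
    for x in seq:
        h = np.tanh(W_in @ x + W_res @ h)
        states.append(h.copy())
    return np.array(states)

def fit_readout(states, actions, reg=1e-4):
    """Ridge regression on the readout: the only trained component."""
    Y = np.eye(n_out)[actions]                      # one-hot action targets
    A = states.T @ states + reg * np.eye(n_res)
    return np.linalg.solve(A, states.T @ Y)         # shape (n_res, n_out)

# Toy usage: supervise the readout with the desired stack interactions,
# here 0 = shift, 1 = reduce (a hypothetical encoding for this sketch).
seq = np.eye(n_in)[[0, 1, 1, 2]]
actions = np.array([0, 0, 0, 1])
H = reservoir_states(seq)
W_out = fit_readout(H, actions)
pred = (H @ W_out).argmax(axis=1)
```

Because only a linear readout is fit, training reduces to one closed-form regression rather than many backpropagation epochs — the source of the few-seconds training times reported in the abstract.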
Pages: 352-364
Page count: 13
Related Papers
50 records in total
  • [1] Reservoir stack machines
    Paaßen, Benjamin
    Schulz, Alexander
    Hammer, Barbara
    Neurocomputing, 2022, 470 : 352 - 364
  • [2] Nondeterministic stack register machines
    Clote, P
    THEORETICAL COMPUTER SCIENCE, 1997, 178 (1-2) : 37 - 76
  • [3] Loading characteristics of weight stack machines
    Grigas, V.
    Satkunskiene, D.
    Domeika, A.
    Tolocka, R. -T.
    Rimsa, G.
    JOURNAL OF VIBROENGINEERING, 2007, 9 (03) : 74 - 77
  • [4] GENERATION OF OPTIMAL CODE FOR STACK MACHINES
    BRUNO, JL
    LASSAGNE, T
    JOURNAL OF THE ACM, 1975, 22 (03) : 382 - 396
  • [5] Self-Adjusting Stack Machines
    Hammer, Matthew A.
    Neis, Georg
    Chen, Yan
    Acar, Umut A.
    OOPSLA 11: PROCEEDINGS OF THE 2011 ACM INTERNATIONAL CONFERENCE ON OBJECT ORIENTED PROGRAMMING SYSTEMS LANGUAGES AND APPLICATIONS, 2011, : 753 - 772
  • [6] Self-Adjusting Stack Machines
    Hammer, Matthew A.
    Neis, Georg
    Chen, Yan
    Acar, Umut A.
    ACM SIGPLAN NOTICES, 2011, 46 (10) : 753 - 772
  • [7] Tradeoff lower bounds for stack machines
    Matei David
    Periklis A. Papakonstantinou
    computational complexity, 2014, 23 : 99 - 146
  • [8] Tradeoff lower bounds for stack machines
    David, Matei
    Papakonstantinou, Periklis A.
    COMPUTATIONAL COMPLEXITY, 2014, 23 (01) : 99 - 146
  • [9] STACK-MACHINES AND COMPILER DESIGN
    MILLER, DL
    BYTE, 1987, 12 (04): 177 - &
  • [10] Adapting EPIC architecture's register stack for virtual stack machines
    Tayeb, Jamel
    Niar, Smail
    DSD 2006: 9TH EUROMICRO CONFERENCE ON DIGITAL SYSTEM DESIGN: ARCHITECTURES, METHODS AND TOOLS, PROCEEDINGS, 2006, : 204 - +