Learning Multiple Timescales in Recurrent Neural Networks

Times cited: 6
Authors
Alpay, Tayfun [1 ]
Heinrich, Stefan [1 ]
Wermter, Stefan [1 ]
Affiliations
[1] Univ Hamburg, Dept Informatik, Knowledge Technology, Vogt-Kölln-Str. 30, D-22527 Hamburg, Germany
Source
ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2016, PT I | 2016 | Vol. 9886
Keywords
Recurrent Neural Networks; Sequence learning; Multiple timescales; Leaky activation; Clocked activation;
DOI
10.1007/978-3-319-44778-0_16
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Recurrent Neural Networks (RNNs) are powerful architectures for sequence learning. Recent advances on the vanishing gradient problem have led to improved results and increased research interest. Among recent proposals are architectural innovations that allow multiple timescales to emerge during training. This paper explores a number of architectures on sequence generation and prediction tasks with long-term dependencies. We compare the Simple Recurrent Network (SRN) and Long Short-Term Memory (LSTM) with the recently proposed Clockwork RNN (CWRNN), Structurally Constrained Recurrent Network (SCRN), and Recurrent Plausibility Network (RPN) with regard to their capability of learning multiple timescales. Our results show that partitioning hidden layers under distinct temporal constraints enables the learning of multiple timescales, which contributes to the understanding of the fundamental conditions that allow RNNs to self-organize toward accurate temporal abstractions.
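The abstract's central idea of partitioning hidden units under distinct temporal constraints can be illustrated with clocked activation as in the CWRNN: the hidden state is split into modules, each of which updates only at its own clock period, so slow modules retain information over long spans. The sketch below is illustrative only (function name, shapes, and the simplification of masking states rather than restricting recurrent connectivity are assumptions, not the paper's implementation):

```python
import numpy as np

def clockwork_step(h, x, W, U, b, periods, t):
    """One recurrent step with the hidden state split into equal-size
    modules; module i updates only when t is divisible by periods[i]
    (a simplified sketch of clocked activation in the CWRNN; the full
    model additionally restricts recurrent connections between modules)."""
    h_candidate = np.tanh(W @ x + U @ h + b)
    module_size = len(h) // len(periods)
    mask = np.zeros_like(h)
    for i, p in enumerate(periods):
        if t % p == 0:
            mask[i * module_size:(i + 1) * module_size] = 1.0
    # Updated units take the new activation; the rest keep their old state.
    return mask * h_candidate + (1.0 - mask) * h
```

With periods such as [1, 2, 4, 8], the first module updates every step while the last updates only every eighth step, giving the network explicitly separated fast and slow timescales.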
Pages: 132-139
Page count: 8