Backpropagation through Time Algorithm for Training Recurrent Neural Networks using Variable Length Instances

Cited by: 0
Authors
Grau, Isel [1 ]
Napoles, Gonzalo [1 ]
Bonet, Isis [2 ]
Garcia, Maria Matilde [1 ]
Affiliations
[1] Univ Cent Marta Abreu Las Villas, Ctr Estudios Informat, Las Villas, Cuba
[2] Escuela Ingn Antioquia, Antioquia, Colombia
Source
COMPUTACION Y SISTEMAS | 2013, Vol. 17, Issue 01
Keywords
Recurrent neural networks; backpropagation through time; sequence analysis; bioinformatics; artificial earthquakes
DOI
Not available
CLC Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Artificial Neural Networks (ANNs) belong to the connectionist techniques of Artificial Intelligence. In particular, Recurrent Neural Networks are a type of ANN widely used in signal reproduction tasks and sequence analysis, where causal relationships in time and space take place. On the other hand, in many problems of science and engineering, the signals or sequences under analysis do not always have the same length, which makes it difficult to select a computational technique for information processing. This article presents a flexible implementation of Recurrent Neural Networks that allows designing the desired topology for specific application problems. Furthermore, the proposed model is capable of learning efficiently from knowledge bases containing instances of variable length. The performance of the suggested implementation is evaluated through a case study of bioinformatics sequence classification. We also mention its application to generating artificial earthquakes for seismic scenarios similar to those of Cuba.
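The record does not reproduce the authors' implementation. As a rough, self-contained sketch of the general idea only, the following Python/NumPy code trains a simple Elman-style recurrent network by full backpropagation through time, unrolling each training instance to its own length so that no padding to a fixed size is needed. All names, dimensions, and the placement of a classification loss at the final time step are illustrative assumptions, not the paper's design.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not taken from the paper)
n_in, n_hid, n_out = 4, 8, 2
Wxh = rng.normal(0.0, 0.1, (n_hid, n_in))   # input-to-hidden weights
Whh = rng.normal(0.0, 0.1, (n_hid, n_hid))  # recurrent hidden-to-hidden weights
Why = rng.normal(0.0, 0.1, (n_out, n_hid))  # hidden-to-output weights

def bptt_step(xs, target, lr=0.01):
    # One full-gradient BPTT update on a single sequence xs of shape
    # (T, n_in); T may differ from instance to instance. The class label
    # `target` is read at the last time step (an assumed loss placement).
    T = len(xs)
    hs = np.zeros((T + 1, n_hid))            # hs[0] is the initial state
    for t in range(T):                       # forward pass, unrolled to length T
        hs[t + 1] = np.tanh(Wxh @ xs[t] + Whh @ hs[t])
    logits = Why @ hs[T]
    p = np.exp(logits - logits.max())
    p /= p.sum()                             # softmax over output classes
    loss = -np.log(p[target])                # cross-entropy at the final step

    dWxh, dWhh, dWhy = np.zeros_like(Wxh), np.zeros_like(Whh), np.zeros_like(Why)
    dlogits = p.copy()
    dlogits[target] -= 1.0                   # d(loss)/d(logits)
    dWhy += np.outer(dlogits, hs[T])
    dh = Why.T @ dlogits                     # gradient flowing into h_T
    for t in reversed(range(T)):             # backward pass through time
        dz = (1.0 - hs[t + 1] ** 2) * dh     # tanh derivative
        dWxh += np.outer(dz, xs[t])
        dWhh += np.outer(dz, hs[t])
        dh = Whh.T @ dz                      # send gradient to the previous step
    for W, dW in ((Wxh, dWxh), (Whh, dWhh), (Why, dWhy)):
        W -= lr * dW                         # plain gradient-descent update
    return loss

# Instances of different lengths (5 and 12 steps) train without padding:
for xs, y in [(rng.normal(size=(5, n_in)), 0),
              (rng.normal(size=(12, n_in)), 1)]:
    bptt_step(xs, y)

Because the forward pass unrolls to exactly len(xs) steps and the same weight matrices are reused at every step, the network accepts sequences of any length without truncation or padding, which is the essence of training on variable-length instances.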
Pages: 15-24
Number of pages: 10