Backpropagation through Time Algorithm for Training Recurrent Neural Networks using Variable Length Instances

Cited by: 0
Authors
Grau, Isel [1 ]
Napoles, Gonzalo [1 ]
Bonet, Isis [2 ]
Garcia, Maria Matilde [1 ]
Affiliations
[1] Univ Cent Marta Abreu Las Villas, Ctr Estudios Informat, Las Villas, Cuba
[2] Escuela Ingn Antioquia, Antioquia, Colombia
Source
COMPUTACION Y SISTEMAS | 2013, Vol. 17, No. 01
Keywords
Recurrent neural networks; backpropagation through time; sequence analysis; bioinformatics; artificial earthquakes;
DOI
Not available
Chinese Library Classification (CLC)
TP [Automation technology; computer technology];
Subject Classification Code
0812;
Abstract
Artificial Neural Networks (ANNs) belong to the connectionist techniques of Artificial Intelligence. In particular, Recurrent Neural Networks are a type of ANN widely used in signal reproduction tasks and in sequence analysis, where causal relationships in time and space take place. On the other hand, in many problems of science and engineering the signals or sequences under analysis do not always have the same length, which makes it difficult to select a computational technique for processing the information. This article presents a flexible implementation of Recurrent Neural Networks that allows designing the desired topology according to the specific application problem. Furthermore, the proposed model is capable of learning efficiently from knowledge bases containing instances of variable length. The performance of the suggested implementation is evaluated through a case study on bioinformatics sequence classification. We also mention its application in obtaining artificial earthquakes from seismic scenarios similar to those of Cuba.
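The abstract describes the technique only at a high level. As a rough illustration of the central idea, the sketch below (in Python with NumPy, under our own assumptions about topology, loss function, and hyper-parameters; it is not the authors' implementation) trains an Elman-style recurrent network with full backpropagation through time, unrolling each instance for exactly as many steps as it contains so that variable-length sequences need no padding or truncation.

```python
import numpy as np

# Minimal Elman-style RNN trained with full backpropagation through time (BPTT).
# Each training instance may have a different length T; the network is simply
# unrolled for T steps. All names and hyper-parameters are illustrative only.

rng = np.random.default_rng(0)

def init_params(n_in, n_hid, n_out):
    s = 0.1
    return {
        "W_xh": rng.normal(0, s, (n_hid, n_in)),   # input -> hidden
        "W_hh": rng.normal(0, s, (n_hid, n_hid)),  # hidden -> hidden (recurrence)
        "W_hy": rng.normal(0, s, (n_out, n_hid)),  # hidden -> output
        "b_h": np.zeros(n_hid),
        "b_y": np.zeros(n_out),
    }

def forward(p, x_seq):
    """Unroll the network over a sequence of arbitrary length."""
    hs = [np.zeros(p["b_h"].shape)]              # h_0 = 0
    for x_t in x_seq:                            # one step per time index
        h = np.tanh(p["W_xh"] @ x_t + p["W_hh"] @ hs[-1] + p["b_h"])
        hs.append(h)
    logits = p["W_hy"] @ hs[-1] + p["b_y"]       # classify from the last state
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return hs, probs

def bptt(p, x_seq, target):
    """Gradients of the cross-entropy loss via backpropagation through time."""
    hs, probs = forward(p, x_seq)
    grads = {k: np.zeros_like(v) for k, v in p.items()}
    dy = probs.copy(); dy[target] -= 1.0         # dL/dlogits for softmax + CE
    grads["W_hy"] += np.outer(dy, hs[-1]); grads["b_y"] += dy
    dh = p["W_hy"].T @ dy                        # error entering the last state
    for t in reversed(range(len(x_seq))):        # walk back through time
        dz = dh * (1.0 - hs[t + 1] ** 2)         # tanh derivative
        grads["W_xh"] += np.outer(dz, x_seq[t])
        grads["W_hh"] += np.outer(dz, hs[t])
        grads["b_h"] += dz
        dh = p["W_hh"].T @ dz                    # propagate to the previous step
    return grads, -np.log(probs[target] + 1e-12)

def train(params, dataset, lr=0.05, epochs=30):
    """dataset: list of (x_seq, label) pairs, where x_seq lengths may differ."""
    for _ in range(epochs):
        for x_seq, label in dataset:
            grads, _ = bptt(params, x_seq, label)
            for k in params:
                params[k] -= lr * np.clip(grads[k], -5.0, 5.0)  # clipped SGD step
    return params
```

Because each call to forward unrolls only as far as the given sequence, instances of different lengths can be mixed freely in the training set; the gradient clipping is included only as a simple way to keep the plain SGD updates stable over long unrollings.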
Pages: 15-24
Page count: 10
Related Papers
50 records in total
  • [31] Reproducing chaos by variable structure recurrent neural networks
    Felix, RA
    Sanchez, EN
    Chen, GR
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 2004, 15 (06): : 1450 - 1457
  • [32] On extreme learning machines in sequential and time series prediction: A non-iterative and approximate training algorithm for recurrent neural networks
    Rizk, Yara
    Awad, Mariette
    NEUROCOMPUTING, 2019, 325 : 1 - 19
  • [33] Time Series Forecasting Using Neural Networks: Are Recurrent Connections Necessary?
    Abdulkarim, Salihu A.
    Engelbrecht, Andries P.
    NEURAL PROCESSING LETTERS, 2019, 50 (03) : 2763 - 2795
  • [34] Memetic evolutionary training for recurrent neural networks: an application to time-series prediction
    Delgado, M
    Pegalajar, MC
    Cuéllar, MP
    EXPERT SYSTEMS, 2006, 23 (02) : 99 - 115
  • [35] Identification of nonlinear time varying systems using recurrent neural networks
    Zou, GF
    Wang, ZO
    8TH INTERNATIONAL CONFERENCE ON NEURAL INFORMATION PROCESSING, VOLS 1-3, PROCEEDINGS, 2001, : 611 - 615
  • [36] Using time-discrete recurrent neural networks in nonlinear control
    Kolb, T
    Ilg, W
    Wille, J
    IEEE WORLD CONGRESS ON COMPUTATIONAL INTELLIGENCE, 1998, : 1367 - 1371
  • [37] Improving Time Series' Forecast Errors by Using Recurrent Neural Networks
    Ashour, Marwan Abdul Hammed
    Abbas, Rabab Alayham
    PROCEEDINGS OF 2018 7TH INTERNATIONAL CONFERENCE ON SOFTWARE AND COMPUTER APPLICATIONS (ICSCA 2018), 2018, : 229 - 232
  • [39] A new boosting algorithm for improved time-series forecasting with recurrent neural networks
    Assaad, Mohammad
    Bone, Romuald
    Cardot, Hubert
    INFORMATION FUSION, 2008, 9 (01) : 41 - 55
  • [40] SEQUENCE-DISCRIMINATIVE TRAINING OF RECURRENT NEURAL NETWORKS
    Voigtlaender, Paul
    Doetsch, Patrick
    Wiesler, Simon
    Schlueter, Ralf
    Ney, Hermann
    2015 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING (ICASSP), 2015, : 2100 - 2104