TRAINING FULLY RECURRENT NEURAL NETWORKS WITH COMPLEX WEIGHTS

Cited by: 36
Authors
KECHRIOTIS, G
MANOLAKOS, ES
Affiliation
[1] Center for Communications and Digital Signal Processing (CDSP) Research and Graduate Studies, Electrical and Computer Engineering Dept., Northeastern University, Boston, MA
Source
IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS II-ANALOG AND DIGITAL SIGNAL PROCESSING | 1994, Vol. 41, No. 3
DOI
10.1109/82.279210
CLC classification
TM [Electrical engineering]; TN [Electronic technology, communication technology];
Discipline codes
0808 ; 0809 ;
Abstract
In this brief paper, the Real-Time Recurrent Learning (RTRL) algorithm for training fully recurrent neural networks in real time is extended to the case of a recurrent neural network whose inputs, outputs, weights, and activation functions are complex. A practical definition of the complex activation function is adopted, and the complex form of the conventional RTRL algorithm is derived. The performance of the proposed algorithm is demonstrated with an application to complex communication-channel equalization.
Pages: 235-238
Page count: 4
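The abstract describes extending RTRL to complex inputs, weights, and activations via a "split" activation applied separately to real and imaginary parts. The sketch below is a minimal NumPy illustration of that idea, not the authors' derivation or code: it tracks RTRL sensitivities of the real and imaginary state parts with respect to the real and imaginary weight parts, and packs the resulting loss gradient back into a complex weight gradient. The network size, split-tanh activation, toy input signal, and target value are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, T = 2, 1, 6                    # neurons, external inputs, time steps (illustrative)

# hypothetical random complex weights and a toy complex input sequence
W = 0.3 * (rng.standard_normal((n, n + m))
           + 1j * rng.standard_normal((n, n + m)))
xs = [rng.standard_normal(m) + 1j * rng.standard_normal(m) for _ in range(T)]
d = 0.4 + 0.2j                       # toy target for the single output neuron z[0]

def split_tanh(s):
    # "split" complex activation: tanh applied to Re and Im parts separately
    return np.tanh(s.real) + 1j * np.tanh(s.imag)

def complex_rtrl_grad(W, xs, d):
    """Forward pass with RTRL sensitivity propagation for a fully recurrent
    complex network z(t+1) = f(W [z(t); x(t)]).  Returns the final-step loss
    J = 0.5 |z_0(T) - d|^2 and dJ/dW packed as (real grad) + j (imag grad)."""
    WR, WI = W.real, W.imag
    AR, AI = WR[:, :n], WI[:, :n]    # recurrent (state) columns of the weights
    z = np.zeros(n, dtype=complex)
    shape = (n, n, n + m)            # p[k, i, j] = d z_k / d W_ij
    pRR = np.zeros(shape); pRI = np.zeros(shape)
    pIR = np.zeros(shape); pII = np.zeros(shape)
    idx = np.arange(n)
    for x in xs:
        u = np.concatenate([z, x]); uR, uI = u.real, u.imag
        s = W @ u
        # sensitivity recursion: propagated state terms, then the direct term
        dsR_dWR = np.einsum('kl,lij->kij', AR, pRR) - np.einsum('kl,lij->kij', AI, pIR)
        dsI_dWR = np.einsum('kl,lij->kij', AR, pIR) + np.einsum('kl,lij->kij', AI, pRR)
        dsR_dWI = np.einsum('kl,lij->kij', AR, pRI) - np.einsum('kl,lij->kij', AI, pII)
        dsI_dWI = np.einsum('kl,lij->kij', AR, pII) + np.einsum('kl,lij->kij', AI, pRI)
        dsR_dWR[idx, idx, :] += uR   # d s_k / d W_ij direct term (delta_ki u_j)
        dsI_dWR[idx, idx, :] += uI
        dsR_dWI[idx, idx, :] -= uI
        dsI_dWI[idx, idx, :] += uR
        gR = (1 - np.tanh(s.real) ** 2)[:, None, None]   # split-tanh derivatives
        gI = (1 - np.tanh(s.imag) ** 2)[:, None, None]
        pRR, pIR = gR * dsR_dWR, gI * dsI_dWR
        pRI, pII = gR * dsR_dWI, gI * dsI_dWI
        z = split_tanh(s)
    e = z[0] - d                     # complex output error at the final step
    J = 0.5 * abs(e) ** 2
    gWR = e.real * pRR[0] + e.imag * pIR[0]
    gWI = e.real * pRI[0] + e.imag * pII[0]
    return J, gWR + 1j * gWI

J, grad = complex_rtrl_grad(W, xs, d)
```

In an online setting the update would be `W -= lr * grad` after every step, reusing the sensitivities as an approximation; here the weights are held fixed so the returned gradient is exact and can be checked against finite differences.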
Related papers
50 items total
  • [31] Training recurrent neural networks with Leap-frog
    Holm, JEW
    Kotze, NJH
    IEEE INTERNATIONAL SYMPOSIUM ON INDUSTRIAL ELECTRONICS (ISIE 98) - PROCEEDINGS, VOLS 1 AND 2, 1998, : 99 - 104
  • [32] Training recurrent neural networks using a hybrid algorithm
    Ben Nasr, Mounir
    Chtourou, Mohamed
    NEURAL COMPUTING & APPLICATIONS, 2012, 21 (03): : 489 - 496
  • [33] Efficient and effective training of sparse recurrent neural networks
    Liu, Shiwei
    Ni'mah, Iftitahu
    Menkovski, Vlado
    Mocanu, Decebal Constantin
    Pechenizkiy, Mykola
    NEURAL COMPUTING AND APPLICATIONS, 2021, 33 : 9625 - 9636
  • [34] Optimal Training Sequences for Locally Recurrent Neural Networks
    Patan, Krzysztof
    Patan, Maciej
    ARTIFICIAL NEURAL NETWORKS - ICANN 2009, PT I, 2009, 5768 : 80 - 89
  • [35] AN AUGMENTED LAGRANGIAN METHOD FOR TRAINING RECURRENT NEURAL NETWORKS
    Wang, Yue
    Zhang, Chao
    Chen, Xiaojun
    SIAM JOURNAL ON SCIENTIFIC COMPUTING, 2025, 47 (01): : C22 - C51
  • [36] Taming the Reservoir: Feedforward Training for Recurrent Neural Networks
    Obst, Oliver
    Riedmiller, Martin
    2012 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2012,
  • [38] Nonlinear Bayesian Filters for Training Recurrent Neural Networks
    Arasaratnam, Ienkaran
    Haykin, Simon
    MICAI 2008: ADVANCES IN ARTIFICIAL INTELLIGENCE, PROCEEDINGS, 2008, 5317 : 12 - 33
  • [39] Gaussian sum filters for recurrent neural networks training
    Todorovic, Branimir
    Stankovic, Miomir
    Moraga, Claudio
    NEUREL 2006: EIGHT SEMINAR ON NEURAL NETWORK APPLICATIONS IN ELECTRICAL ENGINEERING, PROCEEDINGS, 2006, : 53 - +
  • [40] Constrained Training of Recurrent Neural Networks for Automata Learning
    Aichernig, Bernhard K.
    Koenig, Sandra
    Mateis, Cristinel
    Pferscher, Andrea
    Schmidt, Dominik
    Tappler, Martin
    SOFTWARE ENGINEERING AND FORMAL METHODS, SEFM 2022, 2022, 13550 : 155 - 172