TRAINING FULLY RECURRENT NEURAL NETWORKS WITH COMPLEX WEIGHTS

Cited by: 36

Authors
KECHRIOTIS, G
MANOLAKOS, ES
Affiliation
[1] Center for Communications and Digital Signal Processing (CDSP) Research and Graduate Studies, Electrical and Computer Engineering Dept., Northeastern University, Boston, MA
Source
IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS II-ANALOG AND DIGITAL SIGNAL PROCESSING | 1994 / Vol. 41 / No. 3
Keywords
DOI
10.1109/82.279210
CLC number (Chinese Library Classification)
TM [Electrical Engineering]; TN [Electronics and Communication Technology]
Discipline codes
0808; 0809
Abstract
In this brief paper, the Real Time Recurrent Learning (RTRL) algorithm for training fully recurrent neural networks in real time is extended to the case of a recurrent neural network whose inputs, outputs, weights, and activation functions are complex. A practical definition of the complex activation function is adopted and the complex form of the conventional RTRL algorithm is derived. The performance of the proposed algorithm is demonstrated with an application in complex communication channel equalization.
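The abstract describes extending RTRL to complex inputs, weights, and activations under a "practical" complex activation. A minimal NumPy sketch of the idea follows; it is illustrative only, not the paper's derivation. It assumes the common split activation f(z) = sigmoid(Re z) + i·sigmoid(Im z) (one practical choice consistent with the abstract's wording), under which the complex network is equivalent to a doubled real network with block weight matrix [[A, -B], [B, A]] for W = A + iB; ordinary real-valued RTRL is then run on that network and the gradient mapped back to the complex weights. All names here (`ComplexRTRL`, `step`, etc.) are hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class ComplexRTRL:
    """Sketch of RTRL with complex weights (hypothetical illustration).

    Assumes the split activation f(z) = sigmoid(Re z) + 1j*sigmoid(Im z),
    which makes the complex network equivalent to a structured real network
    of twice the size; standard real RTRL runs on that network and the
    gradient is mapped back to the complex weight matrix W = A + iB.
    """

    def __init__(self, n_units, n_inputs, lr=0.5, seed=0):
        rng = np.random.default_rng(seed)
        self.n, self.m = n_units, n_inputs
        self.W = 0.1 * (rng.standard_normal((n_units, n_units + n_inputs))
                        + 1j * rng.standard_normal((n_units, n_units + n_inputs)))
        self.lr = lr
        self.s = np.zeros(n_units, dtype=complex)          # unit outputs
        # RTRL sensitivities d(state_k)/d(M_ij) for the doubled real network
        self.P = np.zeros((2 * n_units, 2 * n_units, 2 * (n_units + n_inputs)))

    def step(self, u, d):
        """One forward step plus an RTRL weight update.

        u : complex input vector (length m);  d : complex target (length n).
        """
        n, m = self.n, self.m
        A, B = self.W.real, self.W.imag
        M = np.block([[A, -B], [B, A]])                    # doubled real weights
        z = np.concatenate([self.s, u])                    # complex activations
        zr = np.concatenate([z.real, z.imag])              # doubled real vector
        xr = sigmoid(M @ zr)                               # doubled real state
        s_new = xr[:n] + 1j * xr[n:]

        # real RTRL sensitivity recursion; `rec` picks the columns of M
        # that multiply fed-back unit outputs (real and imaginary halves)
        rec = np.r_[0:n, n + m:2 * n + m]
        Pnew = np.einsum('kl,lij->kij', M[:, rec], self.P)
        for k in range(2 * n):
            Pnew[k, k, :] += zr                            # direct-dependence term
        Pnew *= (xr * (1.0 - xr))[:, None, None]           # sigmoid derivative

        # instantaneous squared error; map dE/dM back onto W = A + iB
        e = np.concatenate([(s_new - d).real, (s_new - d).imag])
        G = np.einsum('k,kij->ij', e, Pnew)                # dE/dM
        c = n + m
        gA = G[:n, :c] + G[n:, c:]
        gB = G[n:, :c] - G[:n, c:]
        self.W -= self.lr * (gA + 1j * gB)

        self.s, self.P = s_new, Pnew
        return s_new
```

Driving a small network with a constant complex input toward a fixed complex target (both within the range of the split sigmoid), the per-step squared error decreases steadily under this update, which is the basic behavior a complex RTRL equalizer relies on.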
Pages: 235-238 (4 pages)