TRAINING FULLY RECURRENT NEURAL NETWORKS WITH COMPLEX WEIGHTS

Cited: 36
Authors
KECHRIOTIS, G
MANOLAKOS, ES
Affiliation
[1] Center for Communications and Digital Signal Processing (CDSP) Research and Graduate Studies, Electrical and Computer Engineering Dept., Northeastern University, Boston, MA
Keywords
DOI
10.1109/82.279210
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronic & Communication Technology];
Discipline Codes
0808; 0809;
Abstract
In this brief paper, the Real-Time Recurrent Learning (RTRL) algorithm for training fully recurrent neural networks in real time is extended to the case of a recurrent neural network whose inputs, outputs, weights, and activation functions are complex. A practical definition of the complex activation function is adopted, and the complex form of the conventional RTRL algorithm is derived. The performance of the proposed algorithm is demonstrated with an application to complex communication channel equalization.
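The split-activation complex RTRL described in the abstract can be sketched in NumPy. This is a minimal illustration under assumptions, not the paper's exact derivation: the "practical" activation is taken here as f(z) = tanh(Re z) + i·tanh(Im z), all names (`complex_rtrl_grad`, `run_loss`, etc.) are illustrative, and the algorithm tracks real-valued sensitivities of the state's real and imaginary parts with respect to the real and imaginary parts of every weight.

```python
import numpy as np

def run_loss(A, B, xs, ds, u0, v0):
    """Total squared error of the complex RNN over a sequence (forward pass only).

    A = Re(W), B = Im(W); W maps the augmented vector [state; input] to the
    pre-activations. State y = u + i*v, activation applied to Re/Im separately.
    """
    u, v = u0.copy(), v0.copy()
    E = 0.0
    for x, d in zip(xs, ds):
        p = np.concatenate([u, x.real])   # Re of augmented input z = [y; x]
        q = np.concatenate([v, x.imag])   # Im of augmented input
        u = np.tanh(A @ p - B @ q)        # Re(W z) = A p - B q
        v = np.tanh(A @ q + B @ p)        # Im(W z) = A q + B p
        e = d - (u + 1j * v)
        E += 0.5 * np.sum(e.real**2 + e.imag**2)
    return E

def complex_rtrl_grad(A, B, xs, ds, u0, v0):
    """RTRL gradient of the total loss w.r.t. A = Re(W) and B = Im(W)."""
    n, c = A.shape            # n units; c = n + m augmented columns
    P = 2 * n * c             # real parameters: entries of A, then entries of B
    u, v = u0.copy(), v0.copy()
    Pu = np.zeros((n, P))     # sensitivities d u_k / d theta
    Pv = np.zeros((n, P))     # sensitivities d v_k / d theta
    g = np.zeros(P)
    for x, d in zip(xs, ds):
        p = np.concatenate([u, x.real])
        q = np.concatenate([v, x.imag])
        # inputs do not depend on the weights, so their sensitivity rows are zero
        Zu = np.vstack([Pu, np.zeros((c - n, P))])
        Zv = np.vstack([Pv, np.zeros((c - n, P))])
        un = np.tanh(A @ p - B @ q)
        vn = np.tanh(A @ q + B @ p)
        # direct (non-recurrent) dependence of the pre-activations on each weight
        Du = np.zeros((n, P)); Dv = np.zeros((n, P))
        for k in range(n):
            for j in range(c):
                ia = k * c + j            # flat index of A[k, j] in theta
                ib = n * c + k * c + j    # flat index of B[k, j] in theta
                Du[k, ia] = p[j];  Du[k, ib] = -q[j]
                Dv[k, ia] = q[j];  Dv[k, ib] =  p[j]
        # RTRL sensitivity recursion through the split tanh nonlinearity
        Pu = (1 - un**2)[:, None] * (A @ Zu - B @ Zv + Du)
        Pv = (1 - vn**2)[:, None] * (A @ Zv + B @ Zu + Dv)
        u, v = un, vn
        e = d - (u + 1j * v)
        g -= e.real @ Pu + e.imag @ Pv    # accumulate dE/d(theta)
    return g[:n * c].reshape(n, c), g[n * c:].reshape(n, c)
```

For an online update in the spirit of the paper, the complex weights would move as W ← W − η·(gA + i·gB) after each step; the gradient itself can be verified against central finite differences of `run_loss`.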
Pages: 235-238
Page count: 4
Related papers (10 of 50 listed)
  • [1] TRAINING FULLY RECURRENT NEURAL NETWORKS ON A RING TRANSPUTER ARRAY
    KECHRIOTIS, G
    MANOLAKOS, ES
    MICROPROCESSORS AND MICROSYSTEMS, 1994, 18 (01) : 5 - 11
  • [2] Continuous Attractors of Recurrent Neural Networks with Complex-valued Weights
    Li, Jun
    Yang, Jian
    Diao, Yongfeng
    2012 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2012,
  • [3] Training spatially homogeneous fully recurrent neural networks in eigenvalue space
    Perfetti, R
    Massarelli, E
    NEURAL NETWORKS, 1997, 10 (01) : 125 - 137
  • [4] Determination of weights for relaxation recurrent neural networks
    Serpen, G
    Livingston, DL
    NEUROCOMPUTING, 2000, 34 : 145 - 168
  • [5] Optimizing Recurrent Neural Networks: A Study on Gradient Normalization of Weights for Enhanced Training Efficiency
    Wu, Xinyi
    Xiang, Bingjie
    Lu, Huaizheng
    Li, Chaopeng
    Huang, Xingwang
    Huang, Weifang
    APPLIED SCIENCES-BASEL, 2024, 14 (15):
  • [6] An Efficient Algorithm for Complex-Valued Neural Networks Through Training Input Weights
    Liu, Qin
    Sang, Zhaoyang
    Chen, Hua
    Wang, Jian
    Zhang, Huaqing
    NEURAL INFORMATION PROCESSING (ICONIP 2017), PT IV, 2017, 10637 : 150 - 159
  • [7] Quantized Neural Networks: Training Neural Networks with Low Precision Weights and Activations
    Hubara, Itay
    Courbariaux, Matthieu
    Soudry, Daniel
    El-Yaniv, Ran
    Bengio, Yoshua
    JOURNAL OF MACHINE LEARNING RESEARCH, 2018, 18
  • [8] Training of a class of recurrent neural networks
    Shaaban, EM
    ISCAS '98 - PROCEEDINGS OF THE 1998 INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS, VOLS 1-6, 1998, : B78 - B81
  • [9] Complex Gated Recurrent Neural Networks
    Wolter, Moritz
    Yao, Angela
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [10] Rapid training of quantum recurrent neural networks
    Siemaszko, Michał
    Buraczewski, Adam
    Le Saux, Bertrand
    Stobińska, Magdalena
    QUANTUM MACHINE INTELLIGENCE, 2023, 5