Gradient Neural Network with Nonlinear Activation for Computing Inner Inverses and the Drazin Inverse

Cited: 35
Authors
Stanimirovic, Predrag S. [1 ]
Petkovic, Marko D. [1 ]
Gerontitis, Dimitrios [2 ]
Affiliations
[1] Univ Nis, Fac Sci & Math, Visegradska 33, Nish 18000, Serbia
[2] Aristotle Univ Thessaloniki, Thessaloniki, Greece
Keywords
Recurrent neural network; Moore-Penrose inverse; Drazin inverse; Dynamic equation; Activation function; ZNN MODELS; CONVERGENCE; DYNAMICS;
DOI
10.1007/s11063-017-9705-4
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
Two gradient-based recurrent neural networks (GNNs) for solving two matrix equations are presented and investigated. These GNNs can be used to generate various inner inverses, including the Moore-Penrose inverse, and to compute the Drazin inverse. Convergence properties of the defined GNNs are considered, and conditions ensuring convergence to the pseudoinverse and to the Drazin inverse are specified exactly. The influence of nonlinear activation functions on the convergence performance of the defined GNN models is investigated. Computer simulation experiments further confirm the theoretical results.
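As an illustration of the class of dynamics described in the abstract, the sketch below implements a generic nonlinearly activated GNN for the Moore-Penrose inverse of a full-column-rank matrix, using Euler discretization and a `tanh` activation. This is a minimal, hedged example of the general technique, not the paper's exact model: the residual here is taken from the normal equation `A^T A V = A^T`, so that with an odd activation `F` (with `F(0) = 0`) the unique equilibrium is `V = A^+`.

```python
import numpy as np

def gnn_pinv(A, gamma=1.0, h=0.01, steps=20000):
    """Euler-discretized gradient neural network (GNN) for A^+.

    Continuous dynamics (illustrative form, assuming A has full
    column rank so A^T A is positive definite):
        dV/dt = -gamma * A^T A * F(E),  E = A^T A V - A^T,
    with elementwise activation F = tanh. Since F is odd and
    A^T A is invertible, the unique equilibrium is E = 0,
    i.e. V = (A^T A)^{-1} A^T = A^+.
    """
    AtA = A.T @ A
    At = A.T
    V = np.zeros((A.shape[1], A.shape[0]))  # state matrix, starts at 0
    for _ in range(steps):
        E = AtA @ V - At                     # normal-equation residual
        V = V - h * gamma * AtA @ np.tanh(E)  # nonlinear activation step
    return V

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])  # 3x2, full column rank
V = gnn_pinv(A)             # converges toward the pseudoinverse A^+
```

Near the equilibrium `tanh` behaves linearly, so the discretization inherits the stability of the linear GNN provided `h * gamma` is small relative to the largest eigenvalue of `(A^T A)^2`; steeper activations (as studied in the paper) can accelerate convergence away from the equilibrium.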
Pages: 109-133 (25 pages)
Related Papers
30 records total
  • [11] Complex recurrent neural network for computing the inverse and pseudo-inverse of the complex matrix
    Song, JY
    Yam, Y
    APPLIED MATHEMATICS AND COMPUTATION, 1998, 93 (2-3) : 195 - 205
  • [12] On the investigation of activation functions in gradient neural network for online solving linear matrix equation
    Tan, Zhiguo
    Hu, Yueming
    Chen, Ke
    NEUROCOMPUTING, 2020, 413 : 185 - 192
  • [13] Experiments in neural network inverse modelling based control for a class of nonlinear systems
    Petlenkov, E
    Rüstern, E
    BEC 2004: PROCEEDING OF THE 9TH BIENNIAL BALTIC ELECTRONICS CONFERENCE, 2004, : 145 - 148
  • [14] Toward Fuzzy Activation Function Activated Zeroing Neural Network for Currents Computing
    Jin, Jie
    Chen, Weijie
    Ouyang, Aijia
    Liu, Haiyan
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS II-EXPRESS BRIEFS, 2023, 70 (11) : 4201 - 4205
  • [15] Zeroing neural network approaches for computing time-varying minimal rank outer inverse
    Stanimirovic, Predrag S.
    Mourtas, Spyridon D.
    Mosic, Dijana
    Katsikis, Vasilios N.
    Cao, Xinwei
    Li, Shuai
    APPLIED MATHEMATICS AND COMPUTATION, 2024, 465
  • [16] EMPIRICALLY ACCELERATING SCALED GRADIENT PROJECTION USING DEEP NEURAL NETWORK FOR INVERSE PROBLEMS IN IMAGE PROCESSING
    Lee, Byung Hyun
    Chun, Se Young
    2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021, : 1415 - 1419
  • [17] Hardware-Driven Nonlinear Activation for Stochastic Computing Based Deep Convolutional Neural Networks
    Li, Ji
    Yuan, Zihao
    Li, Zhe
    Ding, Caiwen
    Ren, Ao
    Qiu, Qinru
    Draper, Jeffrey
    Wang, Yanzhi
    2017 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2017, : 1230 - 1236
  • [18] An efficient second-order neural network model for computing the Moore-Penrose inverse of matrices
    Li, Lin
    Hu, Jianhao
    IET SIGNAL PROCESSING, 2022, 16 (09) : 1106 - 1117
  • [19] Efficient Inverse Fractional Neural Network-Based Simultaneous Schemes for Nonlinear Engineering Applications
    Shams, Mudassir
    Carpentieri, Bruno
    FRACTAL AND FRACTIONAL, 2023, 7 (12)
  • [20] Accelerated Gradient Approach For Deep Neural Network-Based Adaptive Control of Unknown Nonlinear Systems
    Le, Duc M.
    Patil, Omkar Sudhir
    Nino, Cristian F.
    Dixon, Warren E.
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, : 1 - 15