Prescribed attractivity region selection for recurrent neural networks based on deep reinforcement learning

Cited by: 1
Authors
Bao, Gang [1]
Song, Zhenyan [1]
Xu, Rui [1]
Affiliations
[1] China Three Gorges University, Hubei Key Lab Cascaded Hydropower Stat Operat & C, Yichang 443002, People's Republic of China
Funding
National Natural Science Foundation of China
Keywords
Recurrent neural networks; Attractivity region selection; Deep reinforcement learning; GLOBAL EXPONENTIAL STABILITY; TIME-VARYING DELAYS; DESIGN;
DOI
10.1007/s00521-023-09191-8
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
The outputs of recurrent neural networks (RNNs) are identical when the network states converge to the same saturation region, and sufficiently strong external inputs can drive the network into a prescribed saturation region. Unlike previous works, this paper employs deep reinforcement learning to obtain the external inputs that steer the network states to the desired saturation region. First, for five-dimensional neural networks, the deep Q-network (DQN) algorithm is used to compute optimal external inputs that make the network state converge to the specified saturation region. Because scaling to n-dimensional RNNs runs into the curse of dimensionality, a batch computation of the external inputs is then proposed to cope with it. Finally, the proposed method is validated by numerical examples; compared with existing methods, it yields less conservative conditions on the external inputs.
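To make the idea in the abstract concrete, the sketch below shows one way reinforcement learning can select a constant external input that pushes a saturating RNN into a prescribed saturation region. It is a minimal, hypothetical illustration, not the authors' method: it swaps the paper's DQN for tabular Q-learning over the state's sign pattern, and the discrete-time dynamics x(k+1) = sat(W x(k) + u), the weight matrix W, the candidate inputs, and the target sign pattern are all invented for this example.

```python
# Illustrative sketch only (assumed dynamics and parameters, tabular Q-learning in place of the paper's DQN).
import numpy as np

rng = np.random.default_rng(0)

n = 3                                      # network dimension (hypothetical; the paper treats 5-D and n-D cases)
W = rng.uniform(-0.5, 0.5, size=(n, n))    # hypothetical connection-weight matrix
target_signs = np.array([1.0, -1.0, 1.0])  # prescribed saturation region: desired sign of each state component

def sat(v):
    """Piecewise-linear saturation activation, clipped to [-1, 1]."""
    return np.clip(v, -1.0, 1.0)

def step(x, u):
    """One step of the assumed discrete-time saturating RNN: x(k+1) = sat(W x(k) + u)."""
    return sat(W @ x + u)

# Discrete action set: candidate constant external-input vectors aligned with the target pattern.
actions = [a * target_signs for a in (0.0, 1.0, 2.0, 3.0)]

def encode(x):
    """Discretize a state by its sign pattern; this indexes the Q-table."""
    return int("".join("1" if xi > 0 else "0" for xi in x), 2)

Q = np.zeros((2 ** n, len(actions)))       # tabular stand-in for the paper's Q-network
alpha, gamma, eps = 0.1, 0.95, 0.2         # learning rate, discount factor, exploration rate

for episode in range(500):
    x = rng.uniform(-1.0, 1.0, size=n)     # random initial state
    s = encode(x)
    for k in range(30):
        a = rng.integers(len(actions)) if rng.random() < eps else int(Q[s].argmax())
        x = step(x, actions[a])
        s_next = encode(x)
        # Reward +1 once every component is saturated with the prescribed sign, small step cost otherwise.
        done = bool(np.all(np.abs(x) >= 1.0) and np.all(np.sign(x) == target_signs))
        r = 1.0 if done else -0.01
        Q[s, a] += alpha * (r + gamma * (0.0 if done else Q[s_next].max()) - Q[s, a])
        s = s_next
        if done:
            break

x0 = rng.uniform(-1.0, 1.0, size=n)
u_star = actions[int(Q[encode(x0)].argmax())]
print("learned external input for this initial state:", u_star)
```

The batch computation that the paper proposes for n-dimensional networks (to avoid the curse of dimensionality) is not reproduced here; this toy example only mirrors the low-dimensional DQN-style selection step.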
Pages: 2399-2409
Page count: 11