Time-varying learning rate for recurrent neural networks to solve linear equations

Cited by: 0
Authors
Chen, Yuhuan [1 ]
Chen, Jingjing [2 ]
Yi, Chengfu [2 ]
Affiliations
[1] Guangzhou Maritime Univ, Sch Informat & Commun Engn, Guangzhou, Peoples R China
[2] Guangdong Polytech Normal Univ, Sch Cyber Secur, Guangzhou 510635, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
convergence; gradient-based neural networks; recurrent neural networks; time-varying learning rate; OPTIMIZATION;
DOI
10.1002/mma.8835
Chinese Library Classification (CLC)
O29 [Applied Mathematics];
Subject Classification Code
070104;
Abstract
In practical engineering applications, many time-varying factors arise during the training of recurrent neural networks. In this paper, in contrast to the time-invariant case, a time-varying learning rate is presented to accelerate the convergence of recurrent neural networks (RNNs) when they are used to solve systems of linear equations. Theoretical analysis shows that the neural networks with the time-varying learning rate converge globally to the theoretical solution; moreover, if the linear activation function is used, the model converges exponentially to the theoretical solution, and if the sign-bi-power activation function is used, the state solution of the neural networks converges in finite time. Computer simulation results further substantiate that the time-varying learning rate accelerates the convergence rate of the RNN models.
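To make the idea in the abstract concrete, the sketch below illustrates a gradient-based recurrent model for solving A x = b with a time-varying learning rate. It is a minimal sketch, not the paper's exact model: it assumes the standard gradient dynamics dx/dt = -gamma(t) * A^T * phi(A x - b), an illustrative increasing rate gamma(t) = gamma0 * (1 + t), and one common form of the sign-bi-power activation; the specific functions, parameter values, and names are assumptions for illustration only.

```python
# Minimal sketch (assumed form, not the paper's exact model):
# gradient-based RNN for solving A x = b with a time-varying learning rate,
# integrated by forward Euler.
import numpy as np

def sign_bi_power(e, r=0.5):
    """One common form of the sign-bi-power activation; r in (0, 1) is assumed."""
    return 0.5 * (np.sign(e) * np.abs(e) ** r + np.sign(e) * np.abs(e) ** (1.0 / r))

def solve_linear_gnn(A, b, gamma0=1.0, T=5.0, dt=1e-3, activation=None):
    """Integrate dx/dt = -gamma(t) * A^T * phi(A x - b) from x(0) = 0."""
    n = A.shape[1]
    x = np.zeros(n)
    phi = activation if activation is not None else (lambda e: e)  # linear activation by default
    for k in range(int(T / dt)):
        t = k * dt
        gamma_t = gamma0 * (1.0 + t)          # illustrative time-varying rate (assumed form)
        e = A @ x - b                         # residual of the linear system
        x = x - dt * gamma_t * (A.T @ phi(e)) # gradient-based state update
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((4, 4)) + 4 * np.eye(4)  # well-conditioned test system
    b = rng.standard_normal(4)
    x_hat = solve_linear_gnn(A, b, activation=sign_bi_power)
    print("residual norm:", np.linalg.norm(A @ x_hat - b))
```

With the linear activation (the default above), the residual decays exponentially; swapping in the sign-bi-power activation is the standard device for finite-time convergence, which is the behavior the abstract attributes to that choice.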
Pages: 8