Convergence of online gradient method for pi-sigma neural networks

Cited by: 0
Author affiliations:
[1] Department of Applied Mathematics, Dalian University of Technology, Dalian 116024, China
[2] Not specified
Source: J. Comput. Inf. Syst., 2007, Vol. 6, 2345-2352
Keywords: Neural networks
DOI: not available
Related papers (50 in total)
[21] Jentzen, Arnulf; Riekert, Adrian. A proof of convergence for stochastic gradient descent in the training of artificial neural networks with ReLU activation for constant target functions [J]. ZEITSCHRIFT FUR ANGEWANDTE MATHEMATIK UND PHYSIK, 2022, 73(05).
[22] Penna, T. J. P.; de Oliveira, P. M. C.; Arenzon, J. J.; de Almeida, R. M. C.; Iglesias, J. R. Convergence time on the RS model for neural networks [J]. INTERNATIONAL JOURNAL OF MODERN PHYSICS C, 1991, 2(03): 711-717.
[23] Ma, Runnian; Xie, Yu; Zhang, Shengrui; Liu, Wenbin. Convergence of discrete delayed Hopfield neural networks [J]. COMPUTERS & MATHEMATICS WITH APPLICATIONS, 2009, 57(11-12): 1869-1876.
[24] Cao, Feilong; Zhang, Yongquan; He, Ze-Rong. Interpolation and rates of convergence for a class of neural networks [J]. APPLIED MATHEMATICAL MODELLING, 2009, 33(03): 1441-1456.
[25] Lou, Weipu; Gao, Wei; Han, Xianwei; Zhang, Yimin. Variable order fractional gradient descent method and its application in neural networks optimization [J]. 2022 34TH CHINESE CONTROL AND DECISION CONFERENCE, CCDC, 2022: 109-114.
[26] Abdulrahman, Asmaa M.; Fathi, Bayda G.; Najm, Huda Y. A new spectral conjugate gradient method for unconstrained optimization and its application in neural networks [J]. JOURNAL OF MATHEMATICS AND COMPUTER SCIENCE-JMCS, 2025, 36(03): 326-332.
[27] Neftci, Emre O.; Mostafa, Hesham; Zenke, Friedemann. Surrogate gradient learning in spiking neural networks: Bringing the power of gradient-based optimization to spiking neural networks [J]. IEEE SIGNAL PROCESSING MAGAZINE, 2019, 36(06): 51-63.
[28] Fuhl, Wolfgang; Kasneci, Enkelejda. Weight and gradient centralization in deep neural networks [J]. ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2021, PT IV, 2021, 12894: 227-239.
[29] Zhang, Sushen; Chen, Ruijuan; Du, Wenyu; Yuan, Ye; Vassiliadis, Vassilios S. A Hessian-Free Gradient Flow (HFGF) method for the optimisation of deep learning neural networks [J]. COMPUTERS & CHEMICAL ENGINEERING, 2020, 141.
[30] Wang, Yong; He, Yuli; Zhu, Zhiguang. Study on fast speed fractional order gradient descent method and its application in neural networks [J]. NEUROCOMPUTING, 2022, 489: 366-376.