Construction and approximation rate for feedforward neural network operators with sigmoidal functions

Cited by: 8
Authors
Yu, Dansheng [1 ]
Cao, Feilong [2 ]
Affiliations
[1] Hangzhou Normal Univ, Sch Math, Hangzhou 310036, Zhejiang, Peoples R China
[2] China Jiliang Univ, Coll Sci, Dept Appl Math, Hangzhou 310018, Zhejiang, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Feedforward neural networks; Approximation rate; Sigmoidal function; Modulus of continuity; INTERPOLATION; ERROR; BOUNDS;
DOI
10.1016/j.cam.2024.116150
Chinese Library Classification (CLC)
O29 [Applied Mathematics];
Subject Classification Code
070104;
Abstract
It is well known that feedforward neural networks (FNNs) are universal approximators, a fact that constitutes the theoretical basis for FNN learning. Constructing FNNs explicitly and estimating their approximation errors are important and interesting problems: such constructions not only verify theoretically the generalization ability of the resulting FNNs without training, but also avoid local minima in the parameter optimization process. This paper constructs several classes of FNNs with generalized sigmoidal functions, i.e., sigmoidal functions satisfying certain asymptotic behaviour. Another main purpose of the paper is to estimate approximation rates for the constructed FNNs. Using the modulus of continuity of the target function as a metric, upper bounds on the approximation error are established. In particular, the results contain, as special cases, the corresponding results for FNNs built from several classical sigmoidal functions.
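For concreteness, the following is a minimal illustrative sketch (not the operators constructed in the paper) of a classical quasi-interpolation FNN operator built from the logistic sigmoidal function, the kind of construction the abstract refers to: a bell-shaped kernel phi(x) = (sigma(x+1) - sigma(x-1))/2 is formed from the sigmoid, and the operator is a normalized sum of samples f(k/n) weighted by phi(nx - k). The function names (sigmoid, phi, fnn_operator) are chosen here for illustration only.

import numpy as np

def sigmoid(x):
    # Logistic sigmoidal function; written via tanh to avoid overflow warnings.
    return 0.5 * (1.0 + np.tanh(0.5 * x))

def phi(x):
    # Bell-shaped kernel built from the sigmoid: phi(x) = (sigma(x+1) - sigma(x-1)) / 2.
    return 0.5 * (sigmoid(x + 1.0) - sigmoid(x - 1.0))

def fnn_operator(f, n, x):
    # Quasi-interpolation FNN operator on [-1, 1]:
    # a normalized sum of f(k/n) * phi(n*x - k) over k = -n, ..., n.
    k = np.arange(-n, n + 1)                                        # hidden-node indices
    nodes = k / n                                                   # sample points k/n in [-1, 1]
    weights = phi(np.subtract.outer(n * np.asarray(x, float), k))   # phi(n*x - k)
    numer = weights @ f(nodes)                                      # sum_k f(k/n) * phi(n*x - k)
    denom = weights.sum(axis=-1)                                    # normalization sum_k phi(n*x - k)
    return numer / denom

# Example: approximate f(x) = |x| on [-1, 1]; the maximum error should shrink
# roughly like the modulus of continuity omega(f, 1/n) as n grows.
f = np.abs
xs = np.linspace(-1.0, 1.0, 201)
for n in (8, 32, 128):
    err = np.max(np.abs(fnn_operator(f, n, xs) - f(xs)))
    print(f"n = {n:4d}   max error = {err:.4f}")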
Pages: 16