Density Results by Deep Neural Network Operators with Integer Weights
Cited: 10
Authors:
Costarelli, Danilo [1]
Affiliations:
[1] Univ Perugia, Dept Math & Comp Sci, Via Vanvitelli, I-06123 Perugia, Italy
Keywords:
deep neural networks;
neural network operators;
density results;
ReLU activation function;
RePU activation functions;
sigmoidal functions;
SIMULTANEOUS APPROXIMATIONS;
INTERPOLATION;
BOUNDS;
DOI:
10.3846/mma.2022.15974
Chinese Library Classification (CLC): O1 [Mathematics];
Discipline codes: 0701; 070101;
Abstract:
In the present paper, a new family of multi-layer (deep) neural network (NN) operators is introduced. Density results are established in the space of continuous functions on [-1, 1], with respect to the uniform norm. First, the case of two-layer operators is considered in detail; the definition and the corresponding density results are then extended to the general case of multi-layer operators. All the above definitions allow approximation results to be proved by a constructive approach, in the sense that, for any given f, all the weights, the thresholds, and the coefficients of the deep NN operators can be explicitly determined. Finally, examples of activation functions are provided, together with graphical examples. The main motivation of this work is to provide the multi-layer counterpart of the well-known (shallow) NN operators, in line with how deep neural models are built in applications.
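The record itself does not reproduce the operators' definitions, but the activation functions named in the keywords are standard objects; for reference (these formulas are the commonly used definitions, not taken from the paper), the ReLU and RePU functions are

```latex
\mathrm{ReLU}(x) = \max\{0, x\},
\qquad
\mathrm{RePU}_{l}(x) = \bigl(\max\{0, x\}\bigr)^{l}
= \begin{cases}
x^{l}, & x \ge 0,\\
0, & x < 0,
\end{cases}
\qquad l \in \mathbb{N}.
```

For l = 1 the RePU reduces to the ReLU; higher values of l yield smoother (C^{l-1}) piecewise-polynomial activations.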
Pages: 547-560
Page count: 14