Approximation error estimates by noise-injected neural networks

Cited by: 0
Authors
Akiyama, Keito [1]
Institution
[1] Tohoku Univ, Math Inst, 6-3 Aramaki Aza Aoba, Aoba-ku, Sendai 980-8578, Japan
Funding
Japan Society for the Promotion of Science;
Keywords
approximation of functions; approximation order; feedforward neural networks; stochastic perturbations; universal approximation property; variable-basis; bounds; rates; framework
DOI
10.1002/mma.10288
CLC Number
O29 [Applied Mathematics];
Subject Classification Number
070104;
Abstract
One-hidden-layer feedforward neural networks are functions with many real-valued parameters. Such networks enjoy the universal approximation property, and their approximation error can be bounded in terms of the number of parameters; the essentially optimal order of these error bounds was derived in 1996. Motivated by numerical experiments indicating that networks whose parameters carry stochastic perturbations perform better than ordinary networks, we study the approximation properties of neural networks with stochastic perturbations. In this paper, we derive the quantitative order of the variance of the stochastic perturbations under which the essentially optimal approximation order is achieved, and we verify our theory with numerical experiments.
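A minimal sketch of the setting the abstract describes: a one-hidden-layer network evaluated with i.i.d. Gaussian perturbations injected into its parameters. This is not the authors' code; the tanh activation, the width n, and the variance schedule sigma(n) are illustrative assumptions, and the decay exponent is a placeholder rather than the order derived in the paper.

```python
# Sketch (assumed, not from the paper): a one-hidden-layer feedforward
# network f(x) = sum_i a_i * tanh(w_i * x + b_i), evaluated with Gaussian
# noise of standard deviation sigma added to every parameter.
import numpy as np

rng = np.random.default_rng(0)

def noisy_network(x, w, b, a, sigma):
    """Evaluate the network after perturbing each parameter by N(0, sigma^2)."""
    w_p = w + sigma * rng.standard_normal(w.shape)
    b_p = b + sigma * rng.standard_normal(b.shape)
    a_p = a + sigma * rng.standard_normal(a.shape)
    # np.outer(x, w_p) has shape (len(x), n); b_p broadcasts over rows.
    return np.tanh(np.outer(x, w_p) + b_p) @ a_p

n = 64                                  # hidden width
w = rng.standard_normal(n)
b = rng.standard_normal(n)
a = rng.standard_normal(n) / n
sigma = n ** -1.0                       # placeholder decay rate; the paper
                                        # derives the order actually needed

x = np.linspace(-1.0, 1.0, 200)
y = noisy_network(x, w, b, a, sigma)
print(y.shape)  # (200,)
```

The question the paper answers is how fast sigma must shrink as n grows so that this perturbed family still attains the essentially optimal approximation order of the noiseless networks.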
Pages: 14563-14574
Page count: 12