Approximation error estimates by noise-injected neural networks

Cited: 0
Author
Akiyama, Keito [1 ]
Affiliation
[1] Tohoku Univ, Math Inst, 6-3 Aramaki Aza Aoba,Aoba Ku, Sendai 9808578, Japan
Funding
Japan Society for the Promotion of Science;
Keywords
approximation of functions; approximation order; feedforward neural networks; stochastic perturbations; universal approximation property; VARIABLE-BASIS; BOUNDS; RATES; FRAMEWORK;
DOI
10.1002/mma.10288
Chinese Library Classification
O29 [Applied Mathematics];
Discipline Code
070104;
Abstract
One-hidden-layer feedforward neural networks can be described as functions of many real-valued parameters. Such networks are known to possess the universal approximation property, and the approximation error can be bounded in terms of the number of parameters; the essentially optimal order of this approximation error bound was derived in 1996. Motivated by numerical experiments indicating that neural networks whose parameters contain stochastic perturbations perform better than ordinary neural networks, we explore the approximation properties of neural networks with stochastic perturbations. In this paper, we derive a quantitative order for the variance of the stochastic perturbations under which the essentially optimal approximation order is still achieved, and we verify the theory by numerical experiments.
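The abstract concerns one-hidden-layer networks trained with stochastic perturbations injected into the parameters. The following is a minimal illustrative sketch of that general idea, not the paper's actual construction: the architecture, the tanh activation, the plain gradient-descent loop, the target function, and the noise level `sigma` are all assumptions chosen for demonstration (in particular, `sigma` is a fixed small constant here, not the variance order derived in the paper).

```python
import numpy as np

rng = np.random.default_rng(0)

def one_hidden_layer(x, W, b, c):
    """One-hidden-layer network: f(x) = sum_j c_j * tanh(W_j * x + b_j)."""
    return np.tanh(np.outer(x, W) + b) @ c

# Illustrative target function to approximate on [-1, 1].
x = np.linspace(-1.0, 1.0, 200)
y = np.sin(np.pi * x)

# Random initialization (hypothetical choices).
n_hidden = 20
W = rng.normal(size=n_hidden)
b = rng.normal(size=n_hidden)
c = rng.normal(size=n_hidden) * 0.1
lr, sigma = 0.05, 1e-3  # learning rate and perturbation scale (assumed)

mse0 = float(np.mean((one_hidden_layer(x, W, b, c) - y) ** 2))

for step in range(2000):
    # Inject zero-mean Gaussian perturbations into the parameters
    # used for the forward pass (the "stochastic perturbation").
    Wn = W + rng.normal(scale=sigma, size=n_hidden)
    bn = b + rng.normal(scale=sigma, size=n_hidden)
    cn = c + rng.normal(scale=sigma, size=n_hidden)
    h = np.tanh(np.outer(x, Wn) + bn)          # shape (200, n_hidden)
    err = h @ cn - y
    # Gradients of the mean squared error w.r.t. c, b, W,
    # evaluated at the perturbed parameters.
    grad_c = h.T @ err / len(x)
    grad_hidden = (err[:, None] * cn) * (1.0 - h ** 2)
    grad_b = grad_hidden.sum(axis=0) / len(x)
    grad_W = grad_hidden.T @ x / len(x)
    # Update the unperturbed parameters.
    c -= lr * grad_c
    b -= lr * grad_b
    W -= lr * grad_W

mse = float(np.mean((one_hidden_layer(x, W, b, c) - y) ** 2))
print(f"MSE before training: {mse0:.4f}, after training: {mse:.4f}")
```

The design point the sketch isolates is that the noise enters the parameters used for each forward/backward pass, while the updates are applied to the clean parameter vector; how the perturbation variance must scale with the network size to preserve the optimal approximation order is exactly the question the paper addresses.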
Pages: 14563-14574
Page count: 12