Probabilistic lower bounds for approximation by shallow perceptron networks

Cited by: 23
Authors
Kurkova, Vera [1]
Sanguineti, Marcello [2]
Affiliations
[1] Czech Acad Sci, Inst Comp Sci, Vodarenskou Vezi 2, Prague 18207, Czech Republic
[2] Univ Genoa, DIBRIS, Via Opera Pia 13, I-16145 Genoa, Italy
Keywords
Shallow networks; Perceptrons; Model complexity; Lower bounds on approximation rates; Chernoff-Hoeffding bounds; Neural networks; Computational models; Inequalities; Complexity; Operators
DOI
10.1016/j.neunet.2017.04.003
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Limitations of the approximation capabilities of shallow perceptron networks are investigated. Lower bounds on approximation errors are derived for binary-valued functions on finite domains. It is proven that unless the number of network units is sufficiently large (larger than any polynomial of the logarithm of the size of the domain), a good approximation cannot be achieved for almost any uniformly randomly chosen function on a given domain. The results are obtained by combining probabilistic Chernoff-Hoeffding bounds with estimates of the sizes of the sets of functions exactly computable by shallow networks with increasing numbers of units. (C) 2017 Elsevier Ltd. All rights reserved.
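The probabilistic step described in the abstract can be sketched as follows: a Chernoff-Hoeffding concentration bound, combined with a union bound over a fixed dictionary of candidate approximants, shows that a uniformly random binary-valued function on a large finite domain is unlikely to be well approximated by any of them. This is a minimal illustration with parameter values chosen for demonstration only; it does not reproduce the paper's actual counting estimates for shallow perceptron networks.

```python
import math

def union_hoeffding_bound(m, n_approximants, eps):
    """Upper-bound the probability that a uniformly random binary
    function on a domain of size m agrees with at least one of
    n_approximants fixed functions on >= (1/2 + eps) * m points.

    Per fixed function, the agreement count is Binomial(m, 1/2),
    so Hoeffding gives P(agreement >= m/2 + eps*m) <= exp(-2*eps^2*m);
    the union bound multiplies by the number of candidates.
    """
    return min(1.0, n_approximants * math.exp(-2.0 * eps ** 2 * m))

# Illustrative values (assumptions, not taken from the paper):
# a domain of size 2**14 and a dictionary of 10**6 computable functions.
bound = union_hoeffding_bound(2 ** 14, 10 ** 6, 0.1)
print(bound)  # astronomically small: almost no random function is approximable
```

The qualitative message matches the abstract: the bound stays negligible unless the dictionary of exactly computable functions (and hence the number of network units) grows very fast relative to the logarithm of the domain size.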
Pages: 34-41
Page count: 8