Sobolev-Type Embeddings for Neural Network Approximation Spaces

Cited by: 0
Authors
Philipp Grohs
Felix Voigtlaender
Affiliations
[1] University of Vienna, Faculty of Mathematics
[2] Research Network Data Science @ Uni Vienna, Department of Mathematics
[3] Johann Radon Institute
[4] Technical University of Munich
Source
Constructive Approximation | 2023, Volume 57
Keywords
Deep neural networks; Approximation spaces; Hölder spaces; Embedding theorems; Optimal learning algorithms; Primary: 68T07; 46E35; Secondary: 65D05; 46E30;
DOI
Not available
Abstract
We consider neural network approximation spaces that classify functions according to the rate at which they can be approximated (with error measured in $L^p$) by ReLU neural networks with an increasing number of coefficients, subject to bounds on the magnitude of the coefficients and the number of hidden layers. We prove embedding theorems between these spaces for different values of p. Furthermore, we derive sharp embeddings of these approximation spaces into Hölder spaces. We find that, analogous to the case of classical function spaces (such as Sobolev spaces or Besov spaces), it is possible to trade “smoothness” (i.e., approximation rate) for increased integrability. Combined with our earlier results in Grohs and Voigtlaender (Proof of the theory-to-practice gap in deep learning via sampling complexity bounds for neural network approximation spaces, 2021. arXiv preprint arXiv:2104.02746), our embedding theorems imply a somewhat surprising fact related to “learning” functions from a given neural network space based on point samples: if accuracy is measured with respect to the uniform norm, then an optimal “learning” algorithm for reconstructing functions that are well approximable by ReLU neural networks is simply given by piecewise constant interpolation on a tensor product grid.
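
The final claim of the abstract names a concrete reconstruction rule: sample the target function on a tensor product grid and use piecewise constant interpolation. The following is a minimal illustrative sketch (not the authors' code) of that rule in Python/NumPy; the toy target function f, the grid size m, and the helper name fit_piecewise_constant are assumptions made for this example, and the uniform-norm error is only estimated on a random test set.

# A minimal sketch (not the authors' code) of the reconstruction rule mentioned in the
# abstract: sample on a uniform tensor product grid over [0,1]^d and reconstruct by
# piecewise constant interpolation (value of the grid cell containing the query point).
# The target f and grid size m below are illustrative choices, not taken from the paper.
import numpy as np

def fit_piecewise_constant(f, m, d):
    """Sample f at the centers of a uniform tensor product grid with m cells per axis."""
    centers = (np.arange(m) + 0.5) / m                    # cell centers in [0, 1]
    grids = np.meshgrid(*([centers] * d), indexing="ij")  # tensor product grid
    samples = f(np.stack([g.ravel() for g in grids], axis=1)).reshape((m,) * d)

    def reconstruction(x):
        # Map each query point to the index of the grid cell containing it.
        idx = np.clip((np.asarray(x) * m).astype(int), 0, m - 1)
        return samples[tuple(idx.T)]

    return reconstruction

if __name__ == "__main__":
    d, m = 2, 64
    f = lambda x: np.sin(2 * np.pi * x[:, 0]) * np.abs(x[:, 1] - 0.5)  # toy target
    f_hat = fit_piecewise_constant(f, m, d)

    # Estimate the sup-norm (uniform) error on a dense random test set.
    rng = np.random.default_rng(0)
    x_test = rng.random((100_000, d))
    print("estimated sup-norm error:", np.max(np.abs(f(x_test) - f_hat(x_test))))

The point of the paper's result is that, for functions lying in the considered ReLU approximation spaces and with error measured in the uniform norm, no more sophisticated sampling-based algorithm can improve on the rate achieved by this simple scheme.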
Pages: 579–599
Page count: 20