Deep ReLU networks and high-order finite element methods II: Chebyshev emulation

Cited by: 2
Authors
Opschoor, Joost A. A. [1]
Schwab, Christoph [1 ]
Affiliation
[1] Swiss Fed Inst Technol, Seminar Appl Math, HG G57 1,Ramistr 101, CH-8092 Zurich, Switzerland
Keywords
Neural networks; hp-Finite element methods; Chebyshev expansions; APPROXIMATION; QUADRATURE; NUMBER; BOUNDS;
DOI
10.1016/j.camwa.2024.06.008
Chinese Library Classification
O29 [Applied Mathematics]
Subject Classification Code
070104
Abstract
We show expression rates and stability in Sobolev norms of deep feedforward ReLU neural networks (NNs) in terms of the number of parameters defining the NN for continuous, piecewise polynomial functions, on arbitrary, finite partitions T of a bounded interval (a, b). Novel constructions of ReLU NN surrogates encoding function approximations in terms of Chebyshev polynomial expansion coefficients are developed which require fewer neurons than previous constructions. Chebyshev coefficients can be computed easily from the values of the function in the Clenshaw-Curtis points using the inverse fast Fourier transform. Bounds on expression rates and stability are obtained that are superior to those of constructions based on ReLU NN emulations of monomials as considered in [24,22]. All emulation bounds are explicit in terms of the (arbitrary) partition of the interval, the target emulation accuracy and the polynomial degree in each element of the partition. ReLU NN emulation error estimates are provided for various classes of functions and norms, commonly encountered in numerical analysis. In particular, we show exponential ReLU emulation rate bounds for analytic functions with point singularities and develop an interface between Chebfun approximations and constructive ReLU NN emulations.
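The abstract notes that Chebyshev coefficients are obtained from function values at the Clenshaw-Curtis points via a fast Fourier transform. The following is a minimal NumPy sketch of that standard computation (it is not code from the paper): sampling at the Chebyshev extreme points x_k = cos(pi*k/n) and recovering the coefficients through a length-2n real FFT, which realizes the required cosine transform (DCT-I).

```python
import numpy as np

def chebyshev_coefficients(f, n):
    """Coefficients c_0..c_n with f(x) ~ sum_j c_j * T_j(x) on [-1, 1].

    Samples f at the n+1 Clenshaw-Curtis (Chebyshev extreme) points
    x_k = cos(pi*k/n) and recovers the coefficients with an FFT of
    length 2n, i.e. a discrete cosine transform of type I.
    """
    k = np.arange(n + 1)
    x = np.cos(np.pi * k / n)            # Clenshaw-Curtis points
    v = f(x)
    # Mirror the samples so the cosine transform becomes a plain FFT.
    V = np.concatenate([v, v[n - 1:0:-1]])
    c = np.real(np.fft.fft(V))[: n + 1] / n
    c[0] /= 2.0                          # endpoint terms carry weight 1/2
    c[n] /= 2.0
    return c
```

For a polynomial of degree at most n the recovered coefficients are exact; for example, f(x) = 1.5 + 3x + x^2 = 2*T_0(x) + 3*T_1(x) + 0.5*T_2(x) yields [2, 3, 0.5, 0, 0] with n = 4.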
Pages: 142-162 (21 pages)
References (47 in total)
[1] Bolcskei, Helmut; Grohs, Philipp; Kutyniok, Gitta; Petersen, Philipp. Memory-optimal neural network approximation. Wavelets and Sparsity XVII, 2017, 10394.
[2] Brenner, S. C., 2008, Texts in Applied Mathematics, DOI 10.1007/978-0-387-75934-0.
[3] Chernov, Alexey; von Petersdorff, Tobias; Schwab, Christoph. Exponential convergence of hp quadrature for integral operators with Gevrey kernels. ESAIM: Mathematical Modelling and Numerical Analysis, 2011, 45(3): 387-422.
[4] Daubechies, I., 2022, Constructive Approximation, 55: 127, DOI 10.1007/s00365-021-09548-z.
[5] De Ryck, Tim; Lanthaler, Samuel; Mishra, Siddhartha. On the approximation of functions by tanh neural networks. Neural Networks, 2021, 143: 732-750.
[6] DeVore, Ronald; Hanin, Boris; Petrova, Guergana. Neural network approximation. Acta Numerica, 2021, 30: 327-444.
[7] Elbraechter, Dennis; Grohs, Philipp; Jentzen, Arnulf; Schwab, Christoph. DNN expression rate analysis of high-dimensional PDEs: application to option pricing. Constructive Approximation, 2022, 55(1): 3-71.
[8] Feischl, M., 2020, Numerische Mathematik, 144: 323, DOI 10.1007/s00211-019-01085-z.
[9] Gautschi, W., 2004, Orthogonal Polynomials, DOI 10.1093/OSO/9780198506720.001.0001.
[10] Goetgheluck, P. On the Markov inequality in Lp-spaces. Journal of Approximation Theory, 1990, 62(2): 197-205.