Analysis of the Generalization Error: Empirical Risk Minimization over Deep Artificial Neural Networks Overcomes the Curse of Dimensionality in the Numerical Approximation of Black-Scholes Partial Differential Equations

Cited: 91
Authors
Berner, Julius [1 ]
Grohs, Philipp [1 ,2 ]
Jentzen, Arnulf [3 ,4 ]
Affiliations
[1] Univ Vienna, Fac Math, A-1090 Vienna, Austria
[2] Univ Vienna, Res Platform DataSci UniVienna, A-1090 Vienna, Austria
[3] Swiss Fed Inst Technol, Dept Math, Zurich, Switzerland
[4] Univ Munster, Fac Math & Comp Sci, D-48149 Munster, Germany
Source
SIAM JOURNAL ON MATHEMATICS OF DATA SCIENCE | 2020, Vol. 2, No. 3
Funding
Austrian Science Fund (FWF);
Keywords
deep learning; curse of dimensionality; Kolmogorov equation; generalization error; empirical risk minimization; BOUNDS; ALGORITHM;
DOI
10.1137/19M125649X
Chinese Library Classification
O29 [Applied Mathematics];
Discipline code
070104;
Abstract
The development of new classification and regression algorithms based on empirical risk minimization (ERM) over deep neural network hypothesis classes, coined deep learning, revolutionized the area of artificial intelligence, machine learning, and data analysis. In particular, these methods have been applied to the numerical solution of high-dimensional partial differential equations with great success. Recent simulations indicate that deep learning-based algorithms are capable of overcoming the curse of dimensionality for the numerical solution of Kolmogorov equations, which are widely used in models from engineering, finance, and the natural sciences. The present paper considers under which conditions ERM over a deep neural network hypothesis class approximates the solution of a d-dimensional Kolmogorov equation with affine drift and diffusion coefficients and typical initial values arising from problems in computational finance up to error ε. We establish that, with high probability over draws of training samples, such an approximation can be achieved with both the size of the hypothesis class and the number of training samples scaling only polynomially in d and ε^{-1}. It can be concluded that ERM over deep neural network hypothesis classes overcomes the curse of dimensionality for the numerical solution of linear Kolmogorov equations with affine coefficients.
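The learning problem behind the abstract can be sketched in a few lines: by the Feynman-Kac formula, the solution of a Kolmogorov equation at time T is a conditional expectation, so ERM reduces to least-squares regression on simulated samples. The sketch below is illustrative only and not the paper's method: it uses the one-dimensional heat equation u_t = ½u_xx with initial value φ(x) = x² (whose exact solution is u(T, x) = x² + T) in place of a Black-Scholes equation, and a quadratic polynomial hypothesis class in place of a deep network, so that the ERM step is an explicit least-squares fit.

```python
import numpy as np

# ERM sketch for a 1-d Kolmogorov (heat) equation u_t = 0.5 * u_xx with
# initial value phi(x) = x**2.  By Feynman-Kac, u(T, x) = E[phi(x + W_T)]
# = x**2 + T, so u(T, .) can be learned by regression on simulated data.
# Assumption for brevity: a quadratic polynomial stands in for the deep
# neural network hypothesis class used in the paper.

rng = np.random.default_rng(0)
T = 1.0
n = 200_000  # number of i.i.d. training samples

X = rng.uniform(-1.0, 1.0, size=n)        # sampled initial values
W = rng.normal(0.0, np.sqrt(T), size=n)   # Brownian increments W_T
Y = (X + W) ** 2                          # noisy labels phi(X + W_T)

# ERM: minimize the empirical squared risk over the hypothesis class.
# For a linear-in-parameters class, ERM is exactly least squares.
coeffs = np.polynomial.polynomial.polyfit(X, Y, deg=2)

def u_hat(x):
    """Empirical risk minimizer evaluated at x; approximates u(T, x)."""
    return np.polynomial.polynomial.polyval(x, coeffs)

print(u_hat(0.0))  # close to the exact value u(T, 0) = T = 1
```

The point of the paper is quantitative: for the equations it covers, the number of samples n and the size of the hypothesis class needed for accuracy ε grow only polynomially in the dimension d and in ε^{-1}, rather than exponentially in d.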
Pages: 631-657
Page count: 27