Overcoming the Curse of Dimensionality in the Numerical Approximation of Parabolic Partial Differential Equations with Gradient-Dependent Nonlinearities

Cited by: 0
Authors
Martin Hutzenthaler
Arnulf Jentzen
Thomas Kruse
Affiliations
[1] University of Duisburg-Essen, Faculty of Mathematics
[2] University of Münster, Applied Mathematics: Institute for Analysis and Numerics, Faculty of Mathematics and Computer Science
[3] ETH Zürich, Seminar for Applied Mathematics, Department of Mathematics
[4] University of Gießen, Institute of Mathematics
Source
Foundations of Computational Mathematics | 2022年 / 22卷
Keywords
Curse of dimensionality; Partial differential equation; PDE; Backward stochastic differential equation; BSDE; Multilevel Picard; Multilevel Monte Carlo; Gradient-dependent nonlinearity; 65M75;
DOI: not available
Abstract
Partial differential equations (PDEs) are a fundamental tool in the modeling of many real-world phenomena. In a number of such real-world phenomena the PDEs under consideration contain gradient-dependent nonlinearities and are high-dimensional. Such high-dimensional nonlinear PDEs can in nearly all cases not be solved explicitly, and it is one of the most challenging tasks in applied mathematics to solve high-dimensional nonlinear PDEs approximately. It is especially challenging to design approximation algorithms for nonlinear PDEs for which one can rigorously prove that they overcome the so-called curse of dimensionality, in the sense that the number of computational operations of the approximation algorithm needed to achieve an approximation precision of size $\varepsilon > 0$ grows at most polynomially in both the PDE dimension $d \in \mathbb{N}$ and the reciprocal of the prescribed approximation accuracy $\varepsilon$. In particular, to the best of our knowledge there exists no approximation algorithm in the scientific literature which has been proven to overcome the curse of dimensionality in the case of a class of nonlinear PDEs with general time horizons and gradient-dependent nonlinearities.
It is the key contribution of this article to overcome this difficulty. More specifically, it is the key contribution of this article (i) to propose a new full-history recursive multilevel Picard approximation algorithm for high-dimensional nonlinear heat equations with general time horizons and gradient-dependent nonlinearities and (ii) to rigorously prove that this full-history recursive multilevel Picard approximation algorithm does indeed overcome the curse of dimensionality in the case of such nonlinear heat equations with gradient-dependent nonlinearities.
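To illustrate the flavor of full-history recursive multilevel Picard (MLP) approximations, the following is a minimal sketch of the gradient-independent special case, i.e., a semilinear heat equation $\partial_t u + \tfrac{1}{2}\Delta u + f(u) = 0$ with terminal condition $u(T,\cdot) = g$. This is not the algorithm analyzed in the article: the paper's scheme additionally approximates $\nabla u$ (via Malliavin-type reweighting of the Brownian increments) so that $f$ may depend on the gradient, and the function names and parameter choices below (`mlp`, levels `n`, basis `M`) are illustrative assumptions, not the authors' notation.

```python
import numpy as np

def mlp(n, M, t, x, T, f, g, rng):
    """Sketch of a full-history recursive multilevel Picard approximation.

    Approximates u(t, x) for the semilinear heat equation
        du/dt + (1/2) * Laplacian(u) + f(u) = 0,   u(T, .) = g,
    in the gradient-independent special case f = f(u).
    """
    if n == 0:
        return 0.0  # zeroth Picard iterate
    d = x.shape[0]
    # Monte Carlo estimate of the terminal part E[g(x + W_{T-t})]
    W = rng.standard_normal((M ** n, d)) * np.sqrt(T - t)
    result = float(np.mean([g(x + w) for w in W]))
    # Full-history recursion over all lower Picard levels l = 0, ..., n-1,
    # with telescoping differences f(U_l) - f(U_{l-1}) sampled at a
    # uniformly distributed intermediate time R in [t, T].
    for l in range(n):
        mc = M ** (n - l)
        for _ in range(mc):
            R = t + (T - t) * rng.uniform()
            y = x + rng.standard_normal(d) * np.sqrt(R - t)
            diff = f(mlp(l, M, R, y, T, f, g, rng))
            if l > 0:
                diff -= f(mlp(l - 1, M, R, y, T, f, g, rng))
            result += (T - t) * diff / mc
    return result
```

For $f \equiv 0$ the recursion collapses to plain Monte Carlo averaging of the terminal condition, which gives a quick sanity check: with linear $g$ the exact solution is $u(t,x) = g(x)$, and the sketch reproduces it up to Monte Carlo noise.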
Pages: 905-966 (61 pages)