Adaptive Importance Sampling for Deep Ritz

Cited by: 0
Authors
Wan, Xiaoliang [1 ,2 ]
Zhou, Tao [3 ]
Zhou, Yuancheng [3 ]
Affiliations
[1] Louisiana State Univ, Dept Math, Baton Rouge, LA 70803 USA
[2] Louisiana State Univ, Ctr Computat & Technol, Baton Rouge, LA 70803 USA
[3] Chinese Acad Sci, Inst Computat Math & Sci Engn Comp, LSEC, AMSS, Beijing 100190, Peoples R China
Keywords
Importance sampling; Deep Ritz method; Bounded KRnet; Algorithm
DOI
10.1007/s42967-024-00422-w
CLC number
O29 [Applied Mathematics]
Discipline code
070104
Abstract
We introduce an adaptive sampling method for the Deep Ritz method aimed at solving partial differential equations (PDEs). Two deep neural networks are used. One network approximates the solution of the PDEs, while the other is a deep generative model that generates new collocation points to refine the training set. The adaptive sampling procedure consists of two main steps. The first step solves the PDEs using the Deep Ritz method by minimizing an associated variational loss discretized over the collocation points in the training set. The second step generates a new training set, which is then used in subsequent computations to further improve the accuracy of the current approximate solution. We treat the integrand in the variational loss as an unnormalized probability density function (PDF) and approximate it using a deep generative model called bounded KRnet. The new samples and their associated PDF values are obtained from the bounded KRnet. With these new samples and their associated PDF values, the variational loss can be approximated more accurately by importance sampling. Compared to the original Deep Ritz method, the proposed adaptive method improves accuracy, especially for problems characterized by low regularity and high dimensionality. We demonstrate the effectiveness of the new method through a series of numerical experiments.
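The importance-sampling step described above can be sketched on a toy 1-D problem. The snippet below is an illustration, not the paper's implementation: it estimates a Ritz-type variational loss J(u) = ∫₀¹ ½u′(x)² − f(x)u(x) dx for the exact solution u(x) = sin(πx) of −u″ = f, once with plain uniform Monte Carlo and once with importance sampling, where a fixed arcsine (Beta(½, ½)) proposal stands in for the learned bounded KRnet. Both estimators are unbiased for the same integral, which here equals −π²/4.

```python
import numpy as np

rng = np.random.default_rng(0)

def integrand(x):
    """Pointwise Ritz integrand 0.5*u'(x)^2 - f(x)*u(x) for u = sin(pi x),
    f = pi^2 sin(pi x), so that u solves -u'' = f on (0, 1)."""
    du = np.pi * np.cos(np.pi * x)      # u'(x)
    f = np.pi ** 2 * np.sin(np.pi * x)  # source term
    u = np.sin(np.pi * x)
    return 0.5 * du ** 2 - f * u

# Exact loss: integral of 0.5*pi^2*cos^2 - pi^2*sin^2 over [0,1] = -pi^2/4
exact = -np.pi ** 2 / 4

N = 200_000

# Plain Monte Carlo: uniform collocation points on [0, 1]
x_unif = rng.random(N)
mc_est = integrand(x_unif).mean()

# Importance sampling: draw from a normalized proposal density p(x) and
# reweight by 1/p(x).  The arcsine density is a simple stand-in for the
# learned generative model used in the paper.
x_prop = rng.beta(0.5, 0.5, N)                          # arcsine samples
p = 1.0 / (np.pi * np.sqrt(x_prop * (1.0 - x_prop)))    # their density
is_est = (integrand(x_prop) / p).mean()

print(f"plain MC: {mc_est:.4f}, importance sampling: {is_est:.4f}, "
      f"exact: {exact:.4f}")
```

In the actual method, the proposal density is a bounded KRnet trained so that it approximately matches the (unnormalized) magnitude of the integrand, which is what reduces the variance of the loss estimate; the fixed proposal here only demonstrates the unbiased reweighting.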
Pages: 929-953
Page count: 25