Mesh sampling and weighting for the hyperreduction of nonlinear Petrov-Galerkin reduced-order models with local reduced-order bases

Cited by: 40
Authors
Grimberg, Sebastian [1 ]
Farhat, Charbel [1 ,2 ,3 ]
Tezaur, Radek [1 ]
Bou-Mosleh, Charbel [4 ]
Affiliations
[1] Stanford Univ, Dept Aeronaut & Astronaut, Durand Bldg,496 Lomita Mall, Stanford, CA 94305 USA
[2] Stanford Univ, Dept Mech Engn, Stanford, CA 94305 USA
[3] Stanford Univ, Inst Computat & Math Engn, Stanford, CA 94305 USA
[4] Notre Dame Univ Louaize, Dept Mech Engn, Zouk Mosbeh, Lebanon
Keywords
hyper-reduction; local basis; machine learning; nonlinear model reduction; Petrov-Galerkin; reduced mesh; HYPER-REDUCTION; INTERPOLATION METHOD; PROJECTION;
DOI
10.1002/nme.6603
Chinese Library Classification
T [Industrial Technology];
Discipline Classification Code
08;
Abstract
The energy-conserving sampling and weighting (ECSW) method is a hyper-reduction method originally developed for accelerating the performance of Galerkin projection-based reduced-order models (PROMs) associated with large-scale finite element models, when the underlying projected operators need to be frequently recomputed as in parametric and/or nonlinear problems. In this paper, this hyper-reduction method is extended to Petrov-Galerkin PROMs where the underlying high-dimensional models can be associated with arbitrary finite element, finite volume, and finite difference semi-discretization methods. Its scope is also extended to cover local PROMs based on piecewise-affine approximation subspaces, such as those designed for mitigating the Kolmogorov n-width barrier issue associated with convection-dominated flow problems. The resulting ECSW method is shown in this paper to be robust and accurate. In particular, its offline phase is shown to be fast and parallelizable, and the potential of its online phase for large-scale applications of industrial relevance is demonstrated for turbulent flow problems with O(10^7) and O(10^8) degrees of freedom. For such problems, the online part of the ECSW method proposed in this paper for Petrov-Galerkin PROMs is shown to enable wall-clock time and CPU time speedup factors of several orders of magnitude while delivering exceptional accuracy.
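The abstract summarizes ECSW as a method that samples a subset of mesh entities and assigns them nonnegative weights so that the projected residual computed on the reduced mesh approximates the one assembled on the full mesh. As a rough illustration only, and not the authors' implementation, the Python sketch below shows the nonnegative least-squares construction commonly associated with the ECSW offline phase; the function name ecsw_weights, the matrix and vector shapes, and the placeholder data are illustrative assumptions.

```python
# Minimal, illustrative sketch (NOT the authors' implementation) of an
# ECSW-style offline stage: choose nonnegative element weights xi so that the
# weighted sum of per-element contributions reproduces the full assembled
# projection over the training snapshots.  All names and shapes are assumed
# for illustration only.
import numpy as np
from scipy.optimize import nnls  # Lawson-Hanson nonnegative least squares

def ecsw_weights(G, b):
    """G: (n_train * k, n_elements) array whose column e stacks, over all
          training snapshots, the test-basis projection of element e's
          contribution to the residual.
       b: exact assembled projection, b = G @ ones(n_elements).
       Returns a nonnegative weight vector xi; elements with xi > 0 define
       the reduced mesh."""
    xi, _residual = nnls(G, b)
    return xi

# Usage with random placeholder data (in practice G is built from training
# snapshots of the high-dimensional model):
rng = np.random.default_rng(0)
G = np.abs(rng.standard_normal((40, 200)))  # 40 projected equations, 200 elements
b = G @ np.ones(200)
xi = ecsw_weights(G, b)
reduced_mesh = np.flatnonzero(xi)
# ECSW obtains a sparse xi (few sampled elements) by terminating the NNLS solve
# early at a relative tolerance; the exact solve above does not enforce that
# sparsity and is shown only to convey the structure of the problem.
```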
Pages: 1846-1874
Number of pages: 29
Related papers
50 records in total
  • [21] Accelerated construction of projection-based reduced-order models via incremental approaches
    Agouzal, Eki
    Taddei, Tommaso
    ADVANCED MODELING AND SIMULATION IN ENGINEERING SCIENCES, 2024, 11 (01)
  • [22] Reduced-Order Modeling Based on Hybrid Snapshot Simulation
    Bai, Feng
    Wang, Yi
    INTERNATIONAL JOURNAL OF COMPUTATIONAL METHODS, 2021, 18 (01)
  • [23] Reduced-order autodifferentiable ensemble Kalman filters
    Chen, Yuming
    Sanz-Alonso, Daniel
    Willett, Rebecca
    INVERSE PROBLEMS, 2023, 39 (12)
  • [24] Nonlinear reduced-order modelling for limit-cycle oscillation analysis
    Gai, Guanqun
    Timme, Sebastian
    NONLINEAR DYNAMICS, 2016, 84 : 991 - 1009
  • [25] Reduced-Order Models and Conditional Expectation: Analysing Parametric Low-Order Approximations
    Matthies, Hermann G.
    COMPUTATION, 2025, 13 (02)
  • [26] Reduced-Order Modeling of Deep Neural Networks
    Gusak, J.
    Daulbaev, T.
    Ponomarev, E.
    Cichocki, A.
    Oseledets, I.
    COMPUTATIONAL MATHEMATICS AND MATHEMATICAL PHYSICS, 2021, 61 (05) : 774 - 785
  • [27] Reduced-Order Modeling of Deep Neural Networks
    Gusak, J.
    Daulbaev, T.
    Ponomarev, E.
    Cichocki, A.
    Oseledets, I.
    COMPUTATIONAL MATHEMATICS AND MATHEMATICAL PHYSICS, 2021, 61 : 774 - 785
  • [28] Nonlinear reduced-order modelling for limit-cycle oscillation analysis
    Gai, Guanqun
    Timme, Sebastian
    NONLINEAR DYNAMICS, 2016, 84 (02) : 991 - 1009
  • [29] A dynamic nonlinear optimization framework for learning data-driven reduced-order microkinetic models
    Lejarza, Fernando
    Koninckx, Elsa
    Broadbelt, Linda J.
    Baldea, Michael
    CHEMICAL ENGINEERING JOURNAL, 2023, 462
  • [30] Physics-informed machine learning for reduced-order modeling of nonlinear problems
    Chen, Wenqian
    Wang, Qian
    Hesthaven, Jan S.
    Zhang, Chuhua
    JOURNAL OF COMPUTATIONAL PHYSICS, 2021, 446