Learning physics-constrained subgrid-scale closures in the small-data regime for stable and accurate LES

Cited by: 29
Authors
Guan, Yifei [1 ]
Subel, Adam [1 ]
Chattopadhyay, Ashesh [1 ]
Hassanzadeh, Pedram [1 ,2 ]
Affiliations
[1] Rice Univ, Dept Mech Engn, Houston, TX 77005 USA
[2] Rice Univ, Dept Earth Environm & Planetary Sci, Houston, TX 77005 USA
Keywords
Large eddy simulation; Deep learning; Turbulence; Physics constraints; Small data; Neural networks; Models; Backscatter; Framework; Energy
DOI
10.1016/j.physd.2022.133568
CLC classification
O29 [Applied Mathematics]
Subject classification code
070104
Abstract
We demonstrate how incorporating physics constraints into convolutional neural networks (CNNs) enables learning subgrid-scale (SGS) closures for stable and accurate large-eddy simulations (LES) in the small-data regime (i.e., when the availability of high-quality training data is limited). Using several setups of forced 2D turbulence as testbeds, we examine the a priori and a posteriori performance of three methods for incorporating physics: (1) data augmentation (DA), (2) CNNs with group convolutions (GCNNs), and (3) loss functions that enforce global enstrophy-transfer conservation (EnsCon). While the data-driven closures from physics-agnostic CNNs trained in the big-data regime are accurate and stable, and outperform dynamic Smagorinsky (DSMAG) closures, their performance deteriorates substantially when these CNNs are trained with 40x fewer samples (the small-data regime). An example based on a vortex dipole shows that the physics-agnostic CNN cannot account for the rotational equivariance (symmetry) of never-seen-before samples, an important property of the SGS term; this reveals a major shortcoming of physics-agnostic CNNs in the small-data regime. We show that CNNs with DA and GCNNs address this issue, and each produces accurate and stable data-driven closures in the small-data regime. Despite its simplicity, DA, which adds appropriately rotated samples to the training set, performs as well as, and in some cases even better than, GCNN, which uses a sophisticated equivariance-preserving architecture. EnsCon, which combines structural modeling with aspects of functional modeling, also produces accurate and stable closures in the small-data regime. Overall, GCNN+EnsCon, which combines these two physics constraints, shows the best a posteriori performance in this regime. These results illustrate the power of physics-constrained learning for accurate and stable LES in the small-data regime. (c) 2022 Elsevier B.V. All rights reserved.
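Two of the constraints described in the abstract can be sketched concretely. The snippet below is a minimal, schematic illustration (not the paper's implementation): `augment_with_rotations` implements DA by adding 90-degree-rotated copies of each (filtered field, SGS term) sample, exposing the network to the rotational symmetry of the closure; `enscon_loss` illustrates a loss of the EnsCon type, penalizing mismatch in the domain-averaged enstrophy transfer (here approximated as the mean of the filtered vorticity times the SGS term) alongside the pointwise error. All function names, the `omega_bar` argument, and the exact form of the transfer term are assumptions for illustration only.

```python
import numpy as np


def augment_with_rotations(inputs, targets):
    """Data augmentation (DA): append 0/90/180/270-degree rotations of
    each (input field, SGS target) pair to the training set, so that
    rotational equivariance is learned even in the small-data regime."""
    aug_x, aug_y = [], []
    for x, y in zip(inputs, targets):
        for k in range(4):  # k quarter-turns
            aug_x.append(np.rot90(x, k))
            aug_y.append(np.rot90(y, k))
    return np.stack(aug_x), np.stack(aug_y)


def enscon_loss(pred_pi, true_pi, omega_bar, weight=1.0):
    """Physics-constrained loss (EnsCon-style, schematic): pointwise MSE
    on the predicted SGS term plus a penalty on the mismatch of the
    domain-averaged enstrophy transfer, enforcing its global balance."""
    mse = np.mean((pred_pi - true_pi) ** 2)
    # Global enstrophy-transfer mismatch (assumed form: mean of
    # filtered vorticity times SGS term, predicted vs. true).
    transfer_err = np.mean(omega_bar * pred_pi) - np.mean(omega_bar * true_pi)
    return mse + weight * transfer_err ** 2
```

In a real training loop the MSE term would be replaced by the framework's loss tensor and the penalty added as a differentiable regularizer; the rotation-based DA is exactly what makes physics-agnostic CNNs competitive with equivariant GCNN architectures in the paper's small-data experiments.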
Pages: 14