Progressive, extrapolative machine learning for near-wall turbulence modeling

Cited by: 22
Authors
Bin, Yuanwei [1 ,2 ]
Chen, Lihua [3 ]
Huang, George [4 ]
Yang, Xiang I. A. [1 ]
Affiliations
[1] Penn State Univ, Dept Mech Engn, State Coll, PA 16802 USA
[2] Peking Univ, State Key Lab Turbulence & Complex Syst, Beijing 100871, Peoples R China
[3] Zhejiang Univ, Dept Engn Mech, Hangzhou 310027, Peoples R China
[4] Wright State Univ, Dept Mech & Mat Engn, Dayton, OH 45435 USA
Funding
National Natural Science Foundation of China;
Keywords
DIRECT NUMERICAL-SIMULATION; CHANNEL FLOW; INVARIANCE;
DOI
10.1103/PhysRevFluids.7.084610
Chinese Library Classification
O35 [Fluid mechanics]; O53 [Plasma physics];
Subject classification codes
070204 ; 080103 ; 080704 ;
Abstract
Conventional empirical turbulence modeling is progressive: one begins by modeling simple flows and progressively works towards more complex ones. The outcome is a series of nested models, with each more complex model accounting for some additional physics relative to the previous, less complex one. The above, however, is not the philosophy of data-enabled turbulence modeling. Data-enabled modeling is one-stop: one trains against a body of data containing both simple and complex flows. The resulting model is the best fit to the training data but does not closely reproduce any particular flow. The differences between the two modeling approaches have left data-enabled models open to criticism: machine-learned models do not fully preserve, e.g., the law of the wall (among other empirical facts), and they do not generalize to, e.g., high Reynolds numbers (among other conditions). The purpose of this paper is to respond to and resolve some of these criticisms: we intend to show that conventional progressive modeling is compatible with data-enabled modeling. The paper hinges on the extrapolation theorem and the neutral neural network theorem. The extrapolation theorem allows us to control a network's behavior when extrapolating, and the neutral neural network theorem allows us to augment a network without "catastrophic forgetting." For demonstration purposes, we successively model the flow in the constant stress layer, which is simple; the flow in a channel and a boundary layer, which is more complex; and wall-bounded flow with system rotation, which is even more complex. We show that the more complex models respect the less complex models, and that the models preserve the known empiricism.
Pages: 12