OpenFOAMGPT: A retrieval-augmented large language model (LLM) agent for OpenFOAM-based computational fluid dynamics

Cited by: 2
Authors
Pandey, Sandeep [1]
Xu, Ran [2]
Wang, Wenkang [3]
Chu, Xu [2,4]
Affiliations
[1] Tech Univ Ilmenau, Inst Thermodynam & Fluid Mech, D-98684 Ilmenau, Germany
[2] Univ Stuttgart, Cluster Excellence SimTech, Stuttgart, Germany
[3] Beihang Univ, Int Res Inst Multidisciplinary Sci, Beijing 100191, Peoples R China
[4] Univ Exeter, Fac Environm Sci & Econ, Exeter EX4 4QF, England
Keywords
HEAT-TRANSFER;
DOI
10.1063/5.0257555
Chinese Library Classification (CLC) code
O3 [Mechanics]
Subject classification code
08; 0801
Abstract
This work presents OpenFOAMGPT, a large language model (LLM)-based agent tailored for OpenFOAM-centric computational fluid dynamics (CFD) simulations, built on two foundation models from OpenAI: GPT-4o (Generative Pre-trained Transformer 4o) and the chain-of-thought-enabled o1-preview model. Both agents demonstrate success across multiple tasks. Although the per-token price of the o1 model is six times that of GPT-4o, it consistently exhibits superior performance on complex tasks, from zero-shot/few-shot case setup to boundary-condition modifications, zero-shot turbulence-model adjustments, and zero-shot code translation. Through an iterative correction loop, the agent efficiently addressed single-phase and multiphase flow, heat transfer, Reynolds-averaged Navier-Stokes modeling, large eddy simulation, and other engineering scenarios, often converging within a limited number of iterations at low token cost. To embed domain-specific knowledge, we employed a retrieval-augmented generation pipeline, demonstrating how preexisting simulation setups can further specialize the agent for subdomains such as energy and aerospace. Despite the agent's strong performance, human oversight remains crucial for ensuring accuracy and adapting to shifting contexts. Fluctuations in model performance over time suggest the need for monitoring in mission-critical applications. Although our demonstrations focus on OpenFOAM, the adaptable nature of this framework opens the door to extending LLM-driven agents to a wide range of solvers and codes. By streamlining CFD simulations, this approach has the potential to accelerate both fundamental research and industrial engineering advancements.
Pages: 10
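To make the workflow described in the abstract concrete, the following is a minimal Python sketch (not the authors' implementation) of an iterative correction loop with a retrieval-augmented prompt: the LLM drafts the OpenFOAM case dictionaries, the solver is run, and any failure log is fed back for the next attempt. The query_llm hook, the simpleFoam solver choice, and the iteration budget are all illustrative assumptions.

# Minimal sketch of the iterative correction loop described in the abstract.
# Not the authors' code: query_llm is a hypothetical hook for GPT-4o / o1,
# and "simpleFoam" is only an example solver.
import subprocess
from pathlib import Path

MAX_ITERATIONS = 5  # small budget; the abstract reports convergence in few iterations


def query_llm(prompt: str, context: str = "") -> dict[str, str]:
    """Hypothetical LLM call; should return {relative_path: file_content} for the
    OpenFOAM case (controlDict, fvSchemes, boundary conditions, ...). A
    retrieval-augmented prompt would prepend `context` drawn from existing setups."""
    raise NotImplementedError("connect this to an LLM provider")


def write_case(case_dir: Path, files: dict[str, str]) -> None:
    """Write the generated dictionaries into the case directory."""
    for rel_path, content in files.items():
        target = case_dir / rel_path
        target.parent.mkdir(parents=True, exist_ok=True)
        target.write_text(content)


def run_solver(case_dir: Path, solver: str = "simpleFoam") -> tuple[bool, str]:
    """Run the solver and return (success, combined log)."""
    result = subprocess.run([solver, "-case", str(case_dir)],
                            capture_output=True, text=True)
    return result.returncode == 0, result.stdout + result.stderr


def correction_loop(task: str, case_dir: Path, retrieved_context: str = "") -> bool:
    """Draft a case, run it, and iterate on the error log until it runs cleanly."""
    prompt = task
    for _ in range(MAX_ITERATIONS):
        files = query_llm(prompt, context=retrieved_context)
        write_case(case_dir, files)
        ok, log = run_solver(case_dir)
        if ok:
            return True
        # Feed the failure log back so the next draft can correct it.
        prompt = f"{task}\n\nThe previous attempt failed with:\n{log}\nFix the case files."
    return False

In practice, the retrieved context passed to the loop would come from a corpus of preexisting case setups (for example, energy or aerospace simulations), as the abstract describes for the retrieval-augmented generation pipeline.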