A finite-element-informed neural network for parametric simulation in structural mechanics

Times Cited: 13
Authors
Le-Duc, Thang [1 ]
Nguyen-Xuan, H. [2 ]
Lee, Jaehong [1 ]
Affiliations
[1] Sejong Univ, Deep Learning Architecture Res Ctr, 209 Neungdong Ro, Seoul 05006, South Korea
[2] HUTECH Univ, CIRTECH Inst, Ho Chi Minh City, Vietnam
Funding
National Research Foundation of Singapore;
Keywords
Finite-element-informed neural network; Parametric simulation; Multi-output regression; Data-driven neural network; Finite element method; Structural mechanics; DEEP; OPTIMIZATION; THEOREM;
DOI
10.1016/j.finel.2022.103904
Chinese Library Classification
O29 [Applied Mathematics];
Subject Classification Code
070104;
Abstract
In this study, we propose a novel deep learning model named the finite-element-informed neural network (FEI-NN), inspired by the finite element method (FEM), for the parametric simulation of static problems in structural mechanics. The approach trains neural networks in a supervised manner, in which the parametric variables of a structure serve as input features of the network while the spatial variables are implicitly embedded into the loss function through a soft constraint called the finite element analysis (FEA) loss. The training process simultaneously minimizes the empirical risk function and partially respects the mechanical behavior via the FEA loss, defined as a residual computed from the weak form of a surrogate system scaled from the actual corresponding structure. In addition, a technique based on batch matrix multiplication is proposed to significantly reduce the time complexity of evaluating the FEA loss. The method is applied to several typical systems in structural mechanics, including truss, beam, and plate structures. Through several experiments, we statistically demonstrate the superiority of the approach over the traditional data-driven approach in terms of faster convergence and better DNN models with respect to both generalization and extrapolation performance.
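As a reading aid for the loss described in the abstract, the sketch below illustrates one plausible way to combine a supervised data term with an FEA residual term of the form K(p)u - f(p), where the residual is evaluated for a whole mini-batch at once via batched matrix multiplication (torch.bmm) rather than a Python loop. This is a minimal PyTorch sketch under assumed tensor names, shapes, and weighting factor `lam`; it is not the authors' released implementation.

```python
import torch

def fei_nn_loss(u_pred, u_true, K_batch, f_batch, lam=1.0):
    """Sketch of a combined data + FEA loss (assumed shapes).

    u_pred, u_true: (B, n_dof)        predicted / reference nodal displacements
    K_batch:        (B, n_dof, n_dof) stiffness matrices of the surrogate systems
    f_batch:        (B, n_dof)        load vectors
    lam:            illustrative weight on the FEA (physics) term
    """
    # Empirical risk: standard mean-squared error on the labelled data.
    data_loss = torch.mean((u_pred - u_true) ** 2)

    # FEA loss: residual of the discretised weak form, K u - f, computed for
    # the whole mini-batch with batched matrix multiplication (torch.bmm).
    residual = torch.bmm(K_batch, u_pred.unsqueeze(-1)).squeeze(-1) - f_batch
    fea_loss = torch.mean(residual ** 2)

    return data_loss + lam * fea_loss
```

Evaluating the residual with a single bmm call over the batch dimension, instead of assembling and multiplying each system separately, is what keeps the per-step cost of the physics term low on GPU hardware.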
Pages: 22
Cited References
56 in total
[21] Guehring, Ingo; Kutyniok, Gitta; Petersen, Philipp. Error bounds for approximations with deep ReLU neural networks in W^{s,p} norms. ANALYSIS AND APPLICATIONS, 2020, 18(05): 803-859.
[22] Hackbusch, W.; Kuehn, S. A New Scheme for the Tensor Representation. JOURNAL OF FOURIER ANALYSIS AND APPLICATIONS, 2009, 15(05): 706-722.
[23] Haghighat, Ehsan; Raissi, Maziar; Moure, Adrian; Gomez, Hector; Juanes, Ruben. A physics-informed deep learning framework for inversion and surrogate modeling in solid mechanics. COMPUTER METHODS IN APPLIED MECHANICS AND ENGINEERING, 2021, 379.
[24] He, Juncai; Li, Lin; Xu, Jinchao; Zheng, Chunyue. ReLU deep neural networks and linear finite elements. JOURNAL OF COMPUTATIONAL MATHEMATICS, 2020, 38(03): 502-527.
[25] He, Kaiming; Zhang, Xiangyu; Ren, Shaoqing; Sun, Jian. Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification. 2015 IEEE INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV), 2015: 1026-1034.
[26] Hegde, Kartik; Asghari-Moghaddam, Hadi; Pellauer, Michael; Crago, Neal; Jaleel, Aamer; Solomonik, Edgar; Emer, Joel S.; Fletcher, Christopher W. ExTensor: An Accelerator for Sparse Tensor Algebra. MICRO'52: THE 52ND ANNUAL IEEE/ACM INTERNATIONAL SYMPOSIUM ON MICROARCHITECTURE, 2019: 319-333.
[27] Hinton, Geoffrey; Deng, Li; Yu, Dong; Dahl, George E.; Mohamed, Abdel-rahman; Jaitly, Navdeep; Senior, Andrew; Vanhoucke, Vincent; Nguyen, Patrick; Sainath, Tara N.; Kingsbury, Brian. Deep Neural Networks for Acoustic Modeling in Speech Recognition. IEEE SIGNAL PROCESSING MAGAZINE, 2012, 29(06): 82-97.
[28] Hornik, K.; Stinchcombe, M.; White, H. Multilayer feedforward networks are universal approximators. NEURAL NETWORKS, 1989, 2(05): 359-366.
[29] Hughes, T. J. The Finite Element Method: Linear Static and Dynamic Finite Element Analysis. 1987.
[30] Karniadakis, George Em; Kevrekidis, Ioannis G.; Lu, Lu; Perdikaris, Paris; Wang, Sifan; Yang, Liu. Physics-informed machine learning. NATURE REVIEWS PHYSICS, 2021, 3(06): 422-440.