Deep Data Assimilation: Integrating Deep Learning with Data Assimilation

Cited by: 85
Authors
Arcucci, Rossella [1 ]
Zhu, Jiangcheng [2 ]
Hu, Shuang [3 ]
Guo, Yi-Ke [1 ,4 ]
Affiliations
[1] Imperial Coll London, Data Sci Inst, London SW7 2AZ, England
[2] Zhejiang Univ, State Key Lab Ind Control Technol, Hangzhou 310027, Peoples R China
[3] Ningbo Joynext Technol Inc, Ningbo 315000, Peoples R China
[4] Hong Kong Baptist Univ, Dept Comp Sci, Hong Kong, Peoples R China
Source
APPLIED SCIENCES-BASEL | 2021, Vol. 11, Issue 3
Keywords
data assimilation; deep learning; neural network; model
DOI
10.3390/app11031114
Chinese Library Classification
O6 [Chemistry]
Subject Classification Code
0703
Abstract
In this paper, we propose Deep Data Assimilation (DDA), an integration of Data Assimilation (DA) with Machine Learning (ML). DA is the Bayesian approximation of the true state of a physical system at a given time, obtained by optimally combining time-distributed observations with a dynamic model. We use an ML model to learn the assimilation process. In particular, a recurrent neural network, trained with the state of the dynamical system and the results of the DA process, is applied for this purpose. At each iteration, we learn a function that accumulates the misfit between the results of the forecasting model and the results of the DA. We then compose this function with the dynamic model. The resulting composition is a dynamic model that includes the features of the DA process and that can be used for future prediction without requiring DA. In fact, we prove that the DDA approach implies a reduction of the model error, which decreases at each iteration; this is achieved thanks to the use of DA in the training process. DDA is particularly useful in cases where observations are not available for some time steps and DA cannot be applied to reduce the model error. The effectiveness of the method is validated by examples and a sensitivity study. In this paper, the DDA technology is applied to two different applications: the double integral mass dot system and the Lorenz system. However, the algorithm and numerical methods proposed in this work can be applied to other physics problems that involve other equations and/or state variables.
Pages: 1-21
Number of pages: 21
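As a rough illustration of the workflow described in the abstract above, the sketch below runs an imperfect Lorenz-63 forecast model, corrects it with a deliberately simple constant-gain analysis step, and trains a small LSTM to reproduce the forecast-to-analysis misfit, so that the learned correction can later be composed with the forecast model when no observations are available. The choice of model, the analysis scheme, the window length, and all function and variable names are illustrative assumptions for this sketch, not the paper's exact configuration.

```python
# Minimal sketch of the Deep Data Assimilation (DDA) idea: forecast -> DA analysis,
# then train a recurrent network on the forecast-to-analysis misfit and compose
# the learned correction with the forecast model. Names are illustrative.
import numpy as np
import tensorflow as tf

dim, dt, window = 3, 0.01, 10

def lorenz63_step(x, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One explicit-Euler step of the Lorenz-63 model (the forecast model)."""
    dx = np.array([sigma * (x[1] - x[0]),
                   x[0] * (rho - x[2]) - x[1],
                   x[0] * x[1] - beta * x[2]])
    return x + dt * dx

def simple_analysis(x_forecast, y_obs, gain=0.3):
    """A simplified DA update: constant-gain nudging toward the observation."""
    return x_forecast + gain * (y_obs - x_forecast)

# Synthetic "truth" trajectory and noisy observations of it.
rng = np.random.default_rng(0)
n_steps = 2000
truth = np.zeros((n_steps, dim))
truth[0] = np.array([1.0, 1.0, 1.0])
for k in range(1, n_steps):
    truth[k] = lorenz63_step(truth[k - 1])
obs = truth + 0.5 * rng.standard_normal(truth.shape)

# Run forecast + DA, recording forecast states and forecast-to-analysis misfits.
x = truth[0] + 0.5
forecasts, misfits = [], []
for k in range(1, n_steps):
    x_f = lorenz63_step(x)                 # forecast step
    x_a = simple_analysis(x_f, obs[k])     # DA analysis step
    forecasts.append(x_f)
    misfits.append(x_a - x_f)              # what the network should learn to add
    x = x_a
forecasts, misfits = np.array(forecasts), np.array(misfits)

# Build (window of forecast states) -> (misfit at window end) training pairs.
X = np.stack([forecasts[i:i + window] for i in range(len(forecasts) - window)])
Y = misfits[window - 1:-1]

# Small LSTM standing in for the paper's recurrent misfit model.
net = tf.keras.Sequential([
    tf.keras.Input(shape=(window, dim)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(dim),
])
net.compile(optimizer="adam", loss="mse")
net.fit(X, Y, epochs=5, batch_size=64, verbose=0)

def dda_step(history):
    """Composed model: forecast step plus learned correction, usable when
    observations are unavailable and the DA step cannot be run."""
    x_f = lorenz63_step(history[-1])
    correction = net.predict(np.array(history[-window:])[None], verbose=0)[0]
    return x_f + correction
```

In this sketch the composed model `dda_step` plays the role of the abstract's dynamic model augmented with the learned assimilation features: given a short history of states, it advances the forecast model and adds the correction the network learned from the DA results.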