Asymptotic consistency of the WSINDy algorithm in the limit of continuum data

Cited by: 0
Authors
Messenger, Daniel A. [1 ]
Bortz, David M. [1 ]
Affiliations
[1] Univ Colorado Boulder, Appl Math, Boulder, CO 80309 USA
Funding
U.S. National Science Foundation;
Keywords
data-driven modeling; equation learning; weak formulation; asymptotic consistency; DISCOVERING GOVERNING EQUATIONS; IDENTIFICATION; CONVERGENCE;
DOI
10.1093/imanum/drae086
CLC Classification Number
O29 [Applied Mathematics];
Subject Classification Code
070104;
Abstract
In this work we study the asymptotic consistency of the weak-form sparse identification of nonlinear dynamics algorithm (WSINDy) in the identification of differential equations from noisy samples of solutions. We prove that the WSINDy estimator is unconditionally asymptotically consistent for a wide class of models that includes the Navier-Stokes, Kuramoto-Sivashinsky and Sine-Gordon equations. We thus provide a mathematically rigorous explanation for the observed robustness to noise of weak-form equation learning. Conversely, we also show that, in general, the WSINDy estimator is only conditionally asymptotically consistent, yielding discovery of spurious terms with probability one if the noise level exceeds a critical threshold $\sigma _{c}$. We provide explicit bounds on $\sigma _{c}$ in the case of Gaussian white noise and we explicitly characterize the spurious terms that arise in the case of trigonometric and/or polynomial libraries. Furthermore, we show that, if the data is suitably denoised (a simple moving average filter is sufficient), then asymptotic consistency is recovered for models with locally-Lipschitz, polynomial-growth nonlinearities. Our results reveal important aspects of weak-form equation learning, which may be used to improve future algorithms. We demonstrate our findings numerically using the Lorenz system, the cubic oscillator, a viscous Burgers-growth model and a Kuramoto-Sivashinsky-type high-order PDE.
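To make the weak-form idea concrete, the following is a minimal, illustrative sketch (not the authors' WSINDy implementation) of weak-form equation learning on an assumed toy model $\dot{u} = -u^{3}$: integration by parts moves the time derivative off the noisy data and onto smooth, compactly supported test functions, and sequentially thresholded least squares then selects a sparse set of library terms. All parameter choices (test-function degree, window width, threshold) are illustrative assumptions.

```python
import numpy as np

# Hedged sketch of weak-form sparse identification (NOT the paper's code).
# Assumed toy model: u' = -u^3, exact solution u(t) = 1/sqrt(1 + 2t).
rng = np.random.default_rng(0)
t = np.linspace(0.0, 4.0, 2001)
dt = t[1] - t[0]
u = 1.0 / np.sqrt(1.0 + 2.0 * t) + 0.005 * rng.standard_normal(t.size)

# Library of candidate right-hand-side terms: Theta(u) = [u, u^2, u^3]
Theta = np.column_stack([u, u**2, u**3])

# Compactly supported test functions phi(s) = (s-a)^p (b-s)^p on sliding
# windows; phi and phi' vanish at the window endpoints, so integration by
# parts is exact: -int phi' u dt = int phi Theta(u) dt * w.
p, m, K = 7, 200, 40  # polynomial degree, half-width in samples, # windows
centers = np.linspace(m, t.size - 1 - m, K).astype(int)
G = np.zeros((K, Theta.shape[1]))
b = np.zeros(K)
for k, c in enumerate(centers):
    idx = np.arange(c - m, c + m + 1)
    s = t[idx]
    a_, b_ = s[0], s[-1]
    phi = (s - a_) ** p * (b_ - s) ** p
    dphi = (p * (s - a_) ** (p - 1) * (b_ - s) ** p
            - p * (s - a_) ** p * (b_ - s) ** (p - 1))
    # Weak-form linear system: b = G w, with the derivative on phi, not u
    b[k] = -dt * (dphi @ u[idx])
    G[k] = dt * (phi @ Theta[idx])

# Sequentially thresholded least squares for a sparse coefficient vector
w = np.linalg.lstsq(G, b, rcond=None)[0]
for _ in range(10):
    small = np.abs(w) < 0.2
    w[small] = 0.0
    if (~small).any():
        w[~small] = np.linalg.lstsq(G[:, ~small], b, rcond=None)[0]
print(w)  # the surviving term should be the cubic one, coefficient near -1
```

Because the noisy data is only ever integrated against smooth test functions, pointwise noise is averaged out rather than amplified by differentiation, which is the mechanism behind the noise robustness the abstract refers to.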
Pages: 49