Asymptotic Theory of l1-Regularized PDE Identification from a Single Noisy Trajectory

Cited: 4
Authors
He, Yuchen [1 ]
Suh, Namjoon [2 ]
Huo, Xiaoming [2 ]
Kang, Sung Ha [3 ]
Mei, Yajun [2 ]
Affiliations
[1] Shanghai Jiao Tong Univ, Inst Nat Sci, Shanghai, Peoples R China
[2] Georgia Inst Technol, Sch Ind & Syst Engn, Atlanta, GA 30332 USA
[3] Georgia Inst Technol, Sch Math, Atlanta, GA 30332 USA
Keywords
partial differential equation (PDE); lasso; pseudo least square; signed-support recovery; primal-dual witness construction; local-polynomial regression; STRONG UNIFORM CONSISTENCY; UNCERTAINTY PRINCIPLES; VARIABLE SELECTION; REGRESSION; RECOVERY; REPRESENTATIONS; EQUATIONS; MODELS; RATES; WEAK;
DOI
10.1137/21M1398884
Chinese Library Classification
O1 (Mathematics)
Discipline codes
0701; 070101
Abstract
We provide a formal theoretical analysis of PDE identification via the l1-regularized pseudo least square (l1-PsLS) method from a statistical point of view. We assume that the differential equation governing the dynamic system can be represented as a linear combination of various linear and nonlinear differential terms. Under noisy observations, we employ local-polynomial fitting to estimate the state variables and apply the l1 penalty for model selection. Our theory proves that the classical mutual incoherence condition on the feature matrix F and the beta*-min condition on the ground-truth signal beta* are sufficient for signed-support recovery by the l1-PsLS method. We run numerical experiments on two popular PDE models, the viscous Burgers and the Korteweg-de Vries (KdV) equations, and the results corroborate our theoretical predictions.
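The pipeline the abstract describes — smooth the single noisy trajectory with local-polynomial fits to estimate state variables and their derivatives, assemble a feature matrix of candidate differential terms, then select terms with an l1-penalized least squares fit — can be sketched as follows. This is an illustrative sketch, not the paper's implementation: for brevity it identifies the heat equation u_t = nu*u_xx rather than Burgers or KdV, uses a tiny candidate library, and solves the lasso with plain ISTA; the function names `local_poly_derivs` and `lasso_ista` are ours.

```python
import numpy as np

rng = np.random.default_rng(0)
nu = 0.5

# Synthetic single trajectory: a two-mode solution of u_t = nu * u_xx, plus noise.
nx, nt = 128, 64
x = np.linspace(0.0, 2 * np.pi, nx, endpoint=False)
t = np.linspace(0.0, 1.0, nt)
dx, dt = x[1] - x[0], t[1] - t[0]
X, T = np.meshgrid(x, t, indexing="ij")
u = np.exp(-nu * T) * np.sin(X) + np.exp(-4 * nu * T) * np.sin(2 * X)
u = u + 1e-3 * rng.standard_normal(u.shape)  # one noisy observation of the field

def local_poly_derivs(f, h, win, deg=3):
    """Sliding local-polynomial fit along axis 0; returns (f', f'') on interior rows."""
    half = win // 2
    s = (np.arange(win) - half) * h                      # centered abscissae
    d1 = np.empty((f.shape[0] - 2 * half,) + f.shape[1:])
    d2 = np.empty_like(d1)
    for i in range(half, f.shape[0] - half):
        c = np.polyfit(s, f[i - half : i + half + 1], deg)  # highest degree first
        d1[i - half] = c[-2]          # coefficient of s^1 is f'(x_i)
        d2[i - half] = 2.0 * c[-3]    # coefficient of s^2 is f''(x_i) / 2
    return d1, d2

# Derivative estimates; trim window half-widths off each boundary so grids align.
u_x, u_xx = local_poly_derivs(u, dx, win=9)      # along x: rows 4 .. nx-5
u_t = local_poly_derivs(u.T, dt, win=7)[0].T     # along t: columns 3 .. nt-4
u_mid = u[4 : nx - 4, 3 : nt - 3]
u_x, u_xx = u_x[:, 3 : nt - 3], u_xx[:, 3 : nt - 3]
u_t = u_t[4 : nx - 4, :]

# Candidate library F and response y = u_t, as in sparse PDE identification.
names = ["u", "u_x", "u_xx", "u*u_x"]
F = np.column_stack([a.ravel() for a in (u_mid, u_x, u_xx, u_mid * u_x)])
y = u_t.ravel()

def lasso_ista(F, y, lam, iters=3000):
    """Minimize (1/2n)||y - F b||^2 + lam * ||b||_1 by proximal gradient (ISTA)."""
    n, p = F.shape
    L = np.linalg.norm(F, 2) ** 2 / n            # Lipschitz constant of the gradient
    beta = np.zeros(p)
    for _ in range(iters):
        z = beta - F.T @ (F @ beta - y) / (n * L)
        beta = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold
    return beta

beta = lasso_ista(F, y, lam=1e-2)
print(dict(zip(names, np.round(beta, 3))))  # only the u_xx coefficient should survive
```

With a suitable penalty level the fit zeroes out the spurious terms and keeps u_xx with a coefficient near nu = 0.5, the signed-support recovery behavior the paper analyzes; in the paper's experiments the library is much larger and the smoothing and tuning are done carefully rather than hard-coded as here.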
Pages: 1012-1036
Number of pages: 25