Sample complexity bounds for the local convergence of least squares approximation

Cited: 0
Author
Trunschke, Philipp [1]
Affiliation
[1] Univ Nantes, Ecole Cent Nantes, LMJL UMR CNRS 6629, 2 Chemin Houssiniere, BP 92208, F-44322 Nantes 3, France
Keywords
Least squares approximation; sample complexity; tensor networks; manifolds; positive reach; recovery; equations; cones
DOI
10.1142/S0219530524500271
CLC number
O29 [Applied Mathematics]
Subject classification code
070104
Abstract
We consider the problem of approximating a function in a general nonlinear subset of L², when only a weighted Monte Carlo estimate of the L²-norm is accessible. The concept of sample complexity, i.e. the number of sample points needed to achieve a prescribed error with high probability, is of particular interest in this setting. Reasonable worst-case bounds for this quantity exist only for particular model classes, such as linear spaces or sets of sparse vectors. For more general sets, such as tensor networks or neural networks, the existing bounds are very pessimistic. Restricting the model class to a neighborhood of the best approximation allows us to derive improved worst-case bounds on the sample complexity. When the considered neighborhood is a manifold with positive local reach, its sample complexity can be estimated through the sample complexities of the tangent and normal spaces and the manifold's curvature.
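The setting of the abstract, in its simplest (linear) baseline case, is empirical least squares: draw points from a sampling density w, form the importance-weighted Monte Carlo estimate of the L²-error, and minimize it over the model class. The sketch below illustrates this for a linear span of monomials; it is not the paper's method, and the target function, density, and dimensions are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative target function on [-1, 1] (an assumption, not from the paper).
f = lambda x: np.exp(x)

# Linear model class: span of the monomials 1, x, ..., x^(d-1).
d = 5

# n i.i.d. samples from the sampling density w (here: uniform, w(x) = 1/2).
n = 200
x = rng.uniform(-1.0, 1.0, size=n)
w = np.full(n, 0.5)

# Weighted Monte Carlo estimate of the squared L2-error:
#   ||f - v||^2  ≈  (1/n) * sum_i |f(x_i) - v(x_i)|^2 / w(x_i).
# Minimizing this over the span is a weighted linear least squares problem,
# solved here by rescaling rows with 1/sqrt(w(x_i)).
V = np.vander(x, d, increasing=True)          # design matrix, shape (n, d)
s = 1.0 / np.sqrt(w)
coeffs, *_ = np.linalg.lstsq(V * s[:, None], f(x) * s, rcond=None)

# Empirical (weighted) root-mean-square error of the fit.
err = np.sqrt(np.mean((f(x) - V @ coeffs) ** 2 / w))
```

The sample complexity question the paper studies is how large n must be so that this empirical minimizer is, with high probability, close to the best approximation in the class; the point of the abstract is that localizing the (nonlinear) model class near the best approximation yields much better bounds on this n.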
Pages: 139-167 (29 pages)