CONVERGENCE RATES OF SPECTRAL REGULARIZATION METHODS: A COMPARISON BETWEEN ILL-POSED INVERSE PROBLEMS AND STATISTICAL KERNEL LEARNING

Cited by: 2
Authors
Guastavino, Sabrina [1 ]
Benvenuto, Federico [1 ]
Institution
[1] Univ Genoa, Dept Math, I-16146 Genoa, Italy
Funding
European Union's Horizon 2020;
Keywords
linear ill-posed inverse problems; statistical kernel learning; spectral regularization; convergence rates; neural networks; algorithms; regression; operators
DOI: 10.1137/19M1256038
Chinese Library Classification (CLC): O29 [Applied Mathematics]
Discipline code: 070104
Abstract
In this paper we study the relation between the convergence rates of spectral regularization methods under Hölder-type source conditions arising in the theory of ill-posed inverse problems, as the noise level δ goes to 0, and the convergence rates arising in statistical kernel learning, as the number of samples n goes to infinity. Toward this aim, we introduce a family of hybrid estimators in the statistical learning context whose convergence rates have two properties: first, they equal those of spectral methods, and second, they are connected to the rates of spectral regularization in ill-posed inverse problems, provided a suitable inverse proportionality relation between n and δ holds. This family of estimators allows us to convert upper rates depending on n into upper rates depending on δ and, vice versa, lower rates depending on δ into lower rates depending on n, quantifying their deviation. The analysis is carried out under general source conditions, both when the rank of the forward operator is finite and when it is infinite; in the latter case we treat both the setting in which no assumptions are made on the eigenvalues and the setting of polynomial eigenvalue decay.
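As a concrete illustration of the kind of correspondence the abstract refers to (a minimal textbook-style sketch, not a result quoted from the paper: the Hölder exponent ν, the qualification assumption, and the calibration δ ≍ n^{-1/2} are assumptions made here purely for illustration):

\[
x^\dagger = (A^*A)^{\nu} w, \qquad \|w\| \le \rho, \qquad \nu > 0 \quad \text{(H\"older-type source condition)}
\]
\[
\|x^{\delta}_{\alpha(\delta)} - x^\dagger\| = O\big(\delta^{\frac{2\nu}{2\nu+1}}\big) \quad \text{as } \delta \to 0,
\]
for a spectral regularization method of qualification at least ν with the a priori parameter choice α(δ) ≍ δ^{2/(2ν+1)}. Inserting the commonly assumed calibration δ ≍ n^{-1/2} converts this noise-level rate into a sample-size rate,
\[
O\big(\delta^{\frac{2\nu}{2\nu+1}}\big)\Big|_{\delta \asymp n^{-1/2}} = O\big(n^{-\frac{\nu}{2\nu+1}}\big) \quad \text{as } n \to \infty,
\]
which is the type of conversion between δ-rates and n-rates that the hybrid estimators are designed to make precise.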
Pages: 3504-3529
Number of pages: 26