Penalized wavelet estimation and robust denoising for irregular spaced data

Cited by: 3
Authors
Amato, Umberto [1 ]
Antoniadis, Anestis [2 ,3 ]
De Feis, Italia [4 ]
Gijbels, Irene [5 ,6 ]
Affiliations
[1] CNR, Ist Sci Applicate & Sistemi Intelligenti, Naples, Italy
[2] Univ Grenoble Alpes, Dept Stat, Lab Jean Kuntzmann, Grenoble, France
[3] Univ Cape Town, Dept Stat Sci, Cape Town, South Africa
[4] CNR, Ist Applicaz Calcolo M Picone, Naples, Italy
[5] Katholieke Univ Leuven, Dept Math, Leuven, Belgium
[6] Katholieke Univ Leuven, Leuven Stat Res Ctr LStat, Leuven, Belgium
Keywords
Wavelets; Nonparametric regression; Proximal algorithms; Thresholding; Robust fitting; Variable selection; Adaptive lasso; Regression; Likelihood; Shrinkage; Regularization; Consistency
DOI
10.1007/s00180-021-01174-4
Chinese Library Classification
O21 [Probability theory and mathematical statistics]; C8 [Statistics];
Discipline classification codes
020208; 070103; 0714;
Abstract
Nonparametric univariate regression via wavelets is usually implemented under the assumptions of a dyadic sample size, equally spaced fixed sample points, and i.i.d. normal errors. In this work, we propose, study and compare wavelet-based nonparametric estimation methods designed to recover a one-dimensional regression function from data that do not necessarily satisfy these requirements. These methods apply appropriate regularization by penalizing the decomposition of the unknown regression function on a wavelet basis of functions evaluated on the sampling design. Exploiting the sparsity of wavelet decompositions for signals belonging to homogeneous Besov spaces, we rely on efficient proximal gradient descent algorithms from the recent literature to compute the estimates with fast computation times. Our wavelet-based procedures, in both the standard and the robust regression case, have favorable theoretical properties, thanks in large part to the separable nature of the (non-convex) regularization on which they are based. We establish asymptotic global optimal rates of convergence under weak conditions. It is known that such rates are, in general, unattainable by smoothing splines or other linear nonparametric smoothers. Lastly, we present several experiments that examine the empirical performance of our procedures and compare them with other proposals available in the literature. A regression analysis of several real data applications using these procedures clearly demonstrates their effectiveness.
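To make the general idea concrete, the following minimal sketch (an assumed setup, not the authors' implementation) illustrates penalized wavelet regression for irregularly spaced, non-dyadic data: wavelet basis functions are evaluated at the sampling design points to form a design matrix, and the coefficients are estimated by a proximal gradient (ISTA) loop. The soft-thresholding (l1) proximal step is used here as a simple stand-in for the non-convex separable penalties studied in the paper; the wavelet, penalty level, grid size and simulated data are illustrative choices.

# Illustrative sketch (assumed setup, not the authors' code): ISTA for
# penalized wavelet regression with an irregular, non-dyadic design.
import numpy as np
import pywt

rng = np.random.default_rng(0)

# Irregularly spaced design of non-dyadic size, with heavy-tailed noise.
n = 300
x = np.sort(rng.uniform(0.0, 1.0, n))
f = np.sin(4 * np.pi * x) + (x > 0.5)           # true regression function
y = f + 0.3 * rng.standard_t(df=3, size=n)      # non-Gaussian errors

# Wavelet basis functions evaluated on the sampling design: each column of W
# is one basis function, reconstructed on a fine dyadic grid and then
# interpolated to the observed design points.
grid = np.linspace(0.0, 1.0, 512)
wavelet, level = "db4", 5
c0 = pywt.wavedec(np.zeros(grid.size), wavelet, level=level)
arr0, coeff_slices = pywt.coeffs_to_array(c0)
p = arr0.size
W = np.empty((n, p))
for j in range(p):
    unit = np.zeros(p)
    unit[j] = 1.0
    coeffs = pywt.array_to_coeffs(unit, coeff_slices, output_format="wavedec")
    psi = pywt.waverec(coeffs, wavelet)[: grid.size]
    W[:, j] = np.interp(x, grid, psi)

# Proximal gradient (ISTA): least-squares data fit plus an l1 penalty, whose
# proximal operator is coordinate-wise soft-thresholding.
def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

lam = 0.5                                 # penalty level (illustrative choice)
step = 1.0 / np.linalg.norm(W, 2) ** 2    # 1 / Lipschitz constant of the gradient
beta = np.zeros(p)
for _ in range(500):
    grad = W.T @ (W @ beta - y)
    beta = soft_threshold(beta - step * grad, lam * step)

fhat = W @ beta                           # fitted values at the design points

Replacing the soft-thresholding step with the proximal operator of a non-convex separable penalty, or the squared loss with a robust loss, gives variants in the spirit of the robust procedures described in the abstract.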
Pages: 1621-1651
Number of pages: 31