ORACLE INEQUALITIES FOR SPARSE ADDITIVE QUANTILE REGRESSION IN REPRODUCING KERNEL HILBERT SPACE

Cited: 45
Authors
Lv, Shaogao [1 ]
Lin, Huazhen [2 ]
Lian, Heng [3 ]
Huang, Jian [4 ]
Affiliations
[1] Nanjing Audit Univ, Nanjing, Jiangsu, Peoples R China
[2] Southwestern Univ Finance & Econ, Sch Stat, Ctr Stat Res, Chengdu 611130, Sichuan, Peoples R China
[3] City Univ Hong Kong, Dept Math, Kowloon Tong, Hong Kong, Peoples R China
[4] Univ Iowa, Dept Stat & Actuarial Sci, Iowa City, IA 52242 USA
Funding
National Natural Science Foundation of China;
Keywords
Quantile regression; additive models; sparsity; regularization methods; reproducing kernel Hilbert space; VARIABLE SELECTION; MODEL SELECTION; OPTIMAL RATES; LASSO; ESTIMATORS; SHRINKAGE;
DOI
10.1214/17-AOS1567
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Subject Classification Codes
020208 ; 070103 ; 0714 ;
Abstract
This paper considers the estimation of sparse additive quantile regression (SAQR) in high-dimensional settings. Given the nonsmooth nature of the quantile loss function and the nonparametric complexity of estimating the component functions, it is challenging to analyze the theoretical properties of ultrahigh-dimensional SAQR. We propose a regularized learning approach with a two-fold Lasso-type regularization in a reproducing kernel Hilbert space (RKHS) for SAQR. We establish nonasymptotic oracle inequalities for the excess risk of the proposed estimator without any coherence conditions. If additional assumptions, including an extension of the restricted eigenvalue condition, are satisfied, the proposed method enjoys sharp oracle rates without the light-tail requirement. In particular, the proposed estimator achieves the minimax lower bounds established for sparse additive mean regression. As a by-product, we also establish a concentration inequality for estimating the population mean when a general Lipschitz loss is involved. The practical effectiveness of the new method is demonstrated by competitive numerical results.
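The quantile loss underlying SAQR is the standard check loss ρ_τ(u) = u(τ − 1{u &lt; 0}), whose nonsmoothness at zero is what makes the analysis delicate. A minimal sketch of this loss (an illustration only, not the authors' implementation; the function name is ours):

```python
import numpy as np

def check_loss(u, tau):
    """Quantile (check) loss rho_tau(u) = u * (tau - 1{u < 0})."""
    u = np.asarray(u, dtype=float)
    return u * (tau - (u < 0).astype(float))

# Toy example: residuals from a fitted median (tau = 0.5) regression.
residuals = np.array([-1.0, 0.5, 2.0])
losses = check_loss(residuals, 0.5)  # elementwise: 0.5, 0.25, 1.0
print(losses)
```

For τ = 0.5 the check loss reduces to half the absolute loss, recovering median regression; other τ values weight positive and negative residuals asymmetrically, which is how the method targets a chosen conditional quantile.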
Pages: 781-813
Page count: 33