Honest variable selection in linear and logistic regression models via l1 and l1 + l2 penalization

Cited by: 87
Author
Bunea, Florentina [1 ]
Affiliation
[1] Florida State Univ, Dept Stat, Tallahassee, FL 32306 USA
Keywords
Lasso; elastic net; $\ell_1$ and $\ell_1+\ell_2$ regularization; penalty; sparse; consistent; variable selection; regression; generalized linear models; logistic regression; high dimensions
DOI
10.1214/08-EJS287
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline classification codes
020208; 070103; 0714
Abstract
This paper investigates correct variable selection in finite samples via $\ell_1$ and $\ell_1+\ell_2$ type penalization schemes. The asymptotic consistency of variable selection follows immediately from this analysis. We focus on logistic and linear regression models. The following questions are central to the paper: given a confidence level $1-\delta$, under which assumptions on the design matrix, for which strength of the signal, and for what values of the tuning parameters can we identify the true model at the given level of confidence? Formally, if $\hat{I}$ is an estimate of the true variable set $I^*$, we study conditions under which $P(\hat{I} = I^*) \ge 1-\delta$ for a given sample size $n$, number of parameters $M$, and confidence $1-\delta$. We show that in identifiable models both methods can recover coefficients of size $1/\sqrt{n}$, up to small multiplicative constants and logarithmic factors in $M$ and $1/\delta$. For the models considered here, the advantage of the $\ell_1+\ell_2$ penalization over the $\ell_1$ penalization is minor for the variable selection problem. Whereas the former estimates are unique and become more stable for highly correlated data matrices as the tuning parameter of the $\ell_2$ part increases, too large an increase in this parameter value may preclude variable selection.
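As a rough illustration of the two penalization schemes discussed above (not the paper's own estimators, tuning rules, or theoretical conditions), the following Python sketch fits $\ell_1$-penalized and $\ell_1+\ell_2$-penalized (elastic-net) logistic regressions on simulated sparse data and checks whether each recovers the true variable set $I^*$; the simulation design, signal strength, and penalty values are illustrative assumptions.

# Illustrative sketch only (assumptions, not the paper's method): compare support
# recovery of l1- and l1+l2-penalized logistic regression on simulated sparse data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n, M, k = 200, 50, 5                      # sample size, number of parameters, true support size
X = rng.standard_normal((n, M))
beta = np.zeros(M)
beta[:k] = 1.0                            # true nonzero coefficients; I* = {0, ..., k-1}
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ beta)))

# l1 penalization (Lasso-type); C is the inverse penalty strength
l1_fit = LogisticRegression(penalty="l1", C=0.5, solver="liblinear").fit(X, y)

# l1 + l2 penalization (elastic net); l1_ratio sets the mix of the two penalties
en_fit = LogisticRegression(penalty="elasticnet", C=0.5, l1_ratio=0.7,
                            solver="saga", max_iter=5000).fit(X, y)

I_star = set(np.flatnonzero(beta))
for name, fit in [("l1", l1_fit), ("l1 + l2", en_fit)]:
    I_hat = set(np.flatnonzero(fit.coef_[0]))
    print(f"{name:8s} selected {sorted(I_hat)}   exact recovery: {I_hat == I_star}")

In practice the penalty levels would be chosen by cross-validation or by theoretically motivated values such as those derived in the paper; the fixed C and l1_ratio above are placeholders for demonstration only.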
Pages: 1153-1194
Number of pages: 42