Calibrating Nonconvex Penalized Regression in Ultra-High Dimension

Cited by: 127
Authors
Wang, Lan [1 ]
Kim, Yongdai [2 ]
Li, Runze [3 ,4 ]
Affiliations
[1] Univ Minnesota, Sch Stat, Minneapolis, MN 55455 USA
[2] Seoul Natl Univ, Dept Stat, Seoul, South Korea
[3] Penn State Univ, Dept Stat, University Pk, PA 16802 USA
[4] Penn State Univ, Methodol Ctr, University Pk, PA 16802 USA
Funding
National Natural Science Foundation of China; U.S. National Science Foundation; National Research Foundation of Singapore
Keywords
High-dimensional regression; LASSO; MCP; SCAD; variable selection; penalized least squares; clipped absolute deviation; model selection; diverging number; adaptive LASSO; likelihood; regularization; shrinkage; criteria
DOI
10.1214/13-AOS1159
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline Classification Codes
020208; 070103; 0714
Abstract
We investigate high-dimensional nonconvex penalized regression, where the number of covariates may grow at an exponential rate. Although recent asymptotic theory established that there exists a local minimum possessing the oracle property under general conditions, it is still largely an open problem how to identify the oracle estimator among potentially multiple local minima. There are two main obstacles: (1) due to the presence of multiple minima, the solution path is nonunique and is not guaranteed to contain the oracle estimator; (2) even if a solution path is known to contain the oracle estimator, the optimal tuning parameter depends on many unknown factors and is hard to estimate. To address these two challenging issues, we first prove that an easy-to-calculate calibrated CCCP algorithm produces a consistent solution path which contains the oracle estimator with probability approaching one. Furthermore, we propose a high-dimensional BIC criterion and show that it can be applied to the solution path to select the optimal tuning parameter which asymptotically identifies the oracle estimator. The theory for a general class of nonconvex penalties in the ultra-high dimensional setup is established when the random errors follow a sub-Gaussian distribution. Monte Carlo studies confirm that the calibrated CCCP algorithm combined with the proposed high-dimensional BIC has desirable performance in identifying the underlying sparsity pattern for high-dimensional data analysis.
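The ingredients the abstract names can be sketched concretely. Below is a minimal Python illustration, not the paper's implementation: it shows the SCAD penalty of Fan and Li, the CCCP idea of linearizing the concave part of the penalty so that each subproblem becomes a weighted-L1 (lasso) problem, and a high-dimensional-BIC-style criterion. The function names, the conventional choice a = 3.7, and the particular HBIC form with C_n = log(log n) are assumptions made for this sketch; the CCCP step only computes the per-coordinate L1 weights of the convex surrogate, leaving the weighted-lasso solve to any standard routine.

```python
import numpy as np


def scad_penalty(t, lam, a=3.7):
    """SCAD penalty, applied elementwise to |t| (a = 3.7 is the usual default)."""
    t = np.abs(np.asarray(t, dtype=float))
    mid = (2.0 * a * lam * t - t**2 - lam**2) / (2.0 * (a - 1.0))  # quadratic zone
    return np.where(t <= lam, lam * t,
                    np.where(t <= a * lam, mid, lam**2 * (a + 1.0) / 2.0))


def scad_derivative(t, lam, a=3.7):
    """p'_lam(t) = lam * { 1(t<=lam) + (a*lam - t)_+ / ((a-1)*lam) * 1(t>lam) }."""
    t = np.abs(np.asarray(t, dtype=float))
    return lam * np.where(t <= lam, 1.0,
                          np.maximum(a * lam - t, 0.0) / ((a - 1.0) * lam))


def cccp_weights(beta, lam, a=3.7):
    """One CCCP step: write p_lam = lam*|t| + J(t) with J concave and smooth,
    and replace J by its tangent at the current beta.  The slope of J at
    |beta_j| is p'_lam(|beta_j|) - lam, so the effective L1 weight on
    coordinate j in the convex subproblem is simply p'_lam(|beta_j|);
    feed these weights to any weighted-lasso solver."""
    return scad_derivative(beta, lam, a)


def hbic(y, X, beta):
    """A high-dimensional-BIC-style criterion (assumed form with
    C_n = log(log n)): log(RSS/n) + |support| * log(log n) * log(p) / n."""
    n, p = X.shape
    rss = np.sum((y - X @ beta) ** 2)
    s = np.count_nonzero(beta)
    return np.log(rss / n) + s * np.log(np.log(n)) * np.log(p) / n
```

Note the two properties that motivate nonconvex penalties: for small |t| the SCAD penalty agrees with the lasso (slope lam), while for |t| > a*lam its derivative is exactly zero, so large coefficients are left unshrunk (near-unbiasedness); the CCCP weights inherit this, assigning weight lam near zero and weight 0 to large coordinates.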
Pages: 2505-2536
Page count: 32
Cited References
39 records
[11] Huang, J. (2008). Statistica Sinica, 18, 1603.
[12] Hunter, D. R. and Li, R. (2005). Variable selection using MM algorithms. Annals of Statistics, 33(4), 1617-1642.
[13] Kim, Y. and Kwon, S. (2012). Global optimality of nonconvex penalized estimators. Biometrika, 99(2), 315-325.
[14] Kim, Y. (2012). Journal of Machine Learning Research, 13, 1037.
[15] Kim, Y., Choi, H. and Oh, H.-S. (2008). Smoothly clipped absolute deviation on high dimensions. Journal of the American Statistical Association, 103(484), 1665-1673.
[16] Kwon, S. and Kim, Y. (2012). Large sample properties of the SCAD-penalized maximum likelihood estimation on high dimensions. Statistica Sinica, 22(2), 629-653.
[17] Lounici, K. (2008). Sup-norm convergence rate and sign concentration property of Lasso and Dantzig estimators. Electronic Journal of Statistics, 2, 90-102.
[18] Meinshausen, N. and Yu, B. (2009). Lasso-type recovery of sparse representations for high-dimensional data. Annals of Statistics, 37(1), 246-270.
[19] Mikosch, T. (1990). Probability and Mathematical Statistics, 11, 169.
[20] Paunel-Goerguelue, A. N., Franke, A. G., Paulsen, F. P. and Duenker, N. (2011). Trefoil factor family peptide 2 acts pro-proliferative and pro-apoptotic in the murine retina. Histochemistry and Cell Biology, 135(5), 461-473.