Shrinkage tuning parameter selection with a diverging number of parameters

Cited by: 321
Authors
Wang, Hansheng [1 ]
Li, Bo [2 ]
Leng, Chenlei [3 ]
Affiliations
[1] Peking Univ, Guanghua Sch Management, Beijing 100871, Peoples R China
[2] Tsinghua Univ, Beijing 100084, Peoples R China
[3] Natl Univ Singapore, Singapore 117548, Singapore
Funding
National Natural Science Foundation of China;
Keywords
Bayesian information criterion; Diverging number of parameters; Lasso; Smoothly clipped absolute deviation; NONCONCAVE PENALIZED LIKELIHOOD; LINEAR-MODEL SELECTION; ORACLE PROPERTIES; ADAPTIVE LASSO; BRIDGE;
DOI
10.1111/j.1467-9868.2008.00693.x
Chinese Library Classification
O21 [Probability theory and mathematical statistics]; C8 [Statistics];
Subject classification codes
020208 ; 070103 ; 0714 ;
Abstract
Contemporary statistical research frequently deals with problems involving a diverging number of parameters. For such problems, various shrinkage methods (e.g. the lasso and smoothly clipped absolute deviation) are found to be particularly useful for variable selection. Nevertheless, the desirable performance of these shrinkage methods hinges heavily on an appropriate selection of the tuning parameters. With a fixed predictor dimension, Wang and co-workers have demonstrated that the tuning parameters selected by a Bayesian information criterion type criterion can identify the true model consistently. In this work, similar results are extended to the situation with a diverging number of parameters for both unpenalized and penalized estimators. Consequently, our theoretical results enlarge not only the scope of applicability of Bayesian information criterion type criteria but also that of those shrinkage estimation methods.
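For concreteness, the fixed-dimension criterion referred to above takes the familiar BIC-type form log(SSE_lambda / n) + df_lambda * log(n) / n, minimized over a grid of candidate tuning parameters; the diverging-dimension version studied in the paper modifies this criterion further. The sketch below illustrates the general recipe for the lasso. It is a hypothetical illustration only: the candidate grid, the use of scikit-learn's Lasso, and the selected model size as a degrees-of-freedom proxy are assumptions for the example, not the paper's exact procedure.

```python
# Minimal sketch: BIC-type tuning parameter selection for the lasso.
# Assumed criterion: log(SSE/n) + df * log(n)/n, without the extra
# inflation factor used for the diverging-dimension theory.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 200, 50
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:3] = [3.0, 1.5, 2.0]              # sparse true coefficient vector
y = X @ beta + rng.standard_normal(n)

lambdas = np.logspace(-3, 0, 30)         # candidate tuning parameters
best = None
for lam in lambdas:
    fit = Lasso(alpha=lam, max_iter=10000).fit(X, y)
    resid = y - fit.predict(X)
    sse = float(resid @ resid)
    df = int(np.count_nonzero(fit.coef_))    # model size as df proxy
    bic = np.log(sse / n) + df * np.log(n) / n
    if best is None or bic < best[0]:
        best = (bic, lam, df)

print(f"selected lambda = {best[1]:.4f}, model size = {best[2]}")
```

In this sketch the selected lambda is the grid point minimizing the criterion; consistency of the resulting model selection is the property established in the paper under suitable conditions.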
Pages: 671-683 (13 pages)