Least squares approximation with a diverging number of parameters

Cited: 6
Authors
Leng, Chenlei [2 ]
Li, Bo [1 ]
Institutions
[1] Tsinghua Univ, Sch Econ & Management, Beijing 100084, Peoples R China
[2] Natl Univ Singapore, Dept Stat & Appl Probabil, Singapore 117548, Singapore
Funding
National Natural Science Foundation of China;
Keywords
NONCONCAVE PENALIZED LIKELIHOOD; ADAPTIVE LASSO; SELECTION; SHRINKAGE; ALGORITHM;
DOI
10.1016/j.spl.2009.10.015
CLC Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Subject Classification Codes
020208; 070103; 0714;
Abstract
Regularized regression with the l(1) penalty is a popular approach for variable selection and coefficient estimation. For a unified treatment of the l(1)-constrained model selection, Wang and Leng (2007) proposed the least squares approximation method (LSA) for a fixed dimension. LSA makes use of a quadratic expansion of the loss function and takes full advantage of the fast Lasso algorithm in Efron et al. (2004). In this paper, we extend the fixed-dimension LSA to the situation with a diverging number of parameters. We show that LSA possesses the oracle properties under appropriate conditions when the number of variables grows with the sample size. We propose a new tuning parameter selection method which achieves the oracle properties. Extensive simulation studies confirm the theoretical results. (C) 2009 Elsevier B.V. All rights reserved.
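The quadratic-expansion idea in the abstract can be sketched as follows. Replacing the original loss with its quadratic approximation around an unpenalized pilot estimator beta_tilde gives a criterion of the form (b - beta_tilde)' Sigma (b - beta_tilde) + lam * ||b||_1, which is a Lasso-type problem. The sketch below is an illustrative reconstruction, not the authors' code: it solves this criterion by plain cyclic coordinate descent with soft-thresholding (the paper instead exploits the LARS/Lasso algorithm of Efron et al., 2004), and the function names, the simulated data, and the tuning value `lam` are all assumptions made for the demo.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator: sign(z) * max(|z| - t, 0)."""
    return np.sign(z) * max(abs(z) - t, 0.0)

def lsa_lasso(Sigma, beta_tilde, lam, n_sweeps=200):
    """Minimize the LSA-style criterion
        (b - beta_tilde)' Sigma (b - beta_tilde) + lam * ||b||_1
    by cyclic coordinate descent (hypothetical helper, not the paper's solver)."""
    p = beta_tilde.size
    b = beta_tilde.copy()
    for _ in range(n_sweeps):
        for j in range(p):
            d = b - beta_tilde
            d[j] = 0.0  # exclude coordinate j from the cross term
            z = beta_tilde[j] - (Sigma[j] @ d) / Sigma[j, j]
            b[j] = soft_threshold(z, lam / (2.0 * Sigma[j, j]))
    return b

# Demo on a sparse linear model: pilot OLS fit, then LSA shrinkage.
rng = np.random.default_rng(0)
n, p = 200, 5
beta_true = np.array([2.0, 0.0, 1.5, 0.0, 0.0])
X = rng.standard_normal((n, p))
y = X @ beta_true + rng.standard_normal(n)

beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)  # unpenalized pilot estimator
Sigma = X.T @ X / n                               # quadratic-expansion matrix
beta_lsa = lsa_lasso(Sigma, beta_ols, lam=0.5)    # lam chosen ad hoc for the demo
```

With a suitable `lam`, coordinates whose pilot estimates are small are set exactly to zero while the large coefficients are retained; choosing `lam` in a data-driven way so that this selection enjoys the oracle properties is precisely the tuning-parameter question the paper addresses.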
Pages: 254-261
Page count: 8