Some sharp performance bounds for least squares regression with L1 regularization

Cited by: 127
Authors
Zhang, Tong [1]
Affiliations
[1] Rutgers State Univ, Dept Stat, Piscataway, NJ 08854 USA
Keywords
L-1 regularization; Lasso; regression; sparsity; variable selection; parameter estimation; STATISTICAL ESTIMATION; DANTZIG SELECTOR; SPARSITY; LARGER; LASSO
DOI
10.1214/08-AOS659
CLC Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Subject Classification Codes
020208; 070103; 0714
Abstract
We derive sharp performance bounds for least squares regression with L-1 regularization from the perspectives of parameter estimation accuracy and feature selection quality. The main result proved for L-1 regularization extends a similar result in [Ann. Statist. 35 (2007) 2313-2351] for the Dantzig selector. It gives an affirmative answer to an open question in [Ann. Statist. 35 (2007) 2358-2364]. Moreover, the result leads to an extended view of feature selection that allows less restrictive conditions than some recent work. Based on the theoretical insights, a novel two-stage L-1-regularization procedure with selective penalization is analyzed. It is shown that if the target parameter vector can be decomposed as the sum of a sparse parameter vector with large coefficients and another less sparse vector with relatively small coefficients, then the two-stage procedure can lead to improved performance.
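The two-stage idea in the abstract can be illustrated with a small sketch. This is not the paper's exact procedure: as an assumption for illustration, stage one runs an ordinary Lasso (solved here by cyclic coordinate descent), and stage two refits with the penalty reduced to `eps` on the coordinates whose stage-one coefficients exceed a threshold, so the large coefficients suffer less shrinkage bias. The solver `lasso_cd`, the threshold rule, and all parameter names are hypothetical.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Lasso via cyclic coordinate descent.

    Minimizes (1/(2n)) * ||y - X b||^2 + sum_j lam_j * |b_j|,
    where `lam` may be a scalar or a per-coordinate array.
    """
    n, p = X.shape
    lam = np.broadcast_to(np.asarray(lam, dtype=float), (p,))
    col_sq = (X ** 2).sum(axis=0)   # column norms ||x_j||^2
    beta = np.zeros(p)
    r = y - X @ beta                 # residual
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * beta[j]   # add coordinate j back into the residual
            rho = X[:, j] @ r
            # soft-thresholding update for coordinate j
            beta[j] = np.sign(rho) * max(abs(rho) - n * lam[j], 0.0) / col_sq[j]
            r -= X[:, j] * beta[j]
    return beta

def two_stage_lasso(X, y, lam, eps=0.0, thresh=None):
    """Two-stage L1 regularization with selective penalization (sketch).

    Stage 1: ordinary Lasso with penalty `lam`.
    Stage 2: refit with the penalty lowered to `eps` on coordinates whose
    stage-1 coefficients exceed `thresh` (a heuristic proxy for the
    "sparse part with large coefficients").
    """
    beta1 = lasso_cd(X, y, lam)
    if thresh is None:
        thresh = lam                 # heuristic default threshold
    lam2 = np.where(np.abs(beta1) > thresh, eps, lam)
    return lasso_cd(X, y, lam2)
```

On synthetic data with a few large true coefficients, the stage-two estimate typically sits closer to the truth than the plain Lasso, because soft-thresholding shrinks each selected coefficient by roughly the penalty level while the second stage removes that bias on the selected support.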
Pages: 2109-2144
Page count: 36