Some sharp performance bounds for least squares regression with L1 regularization

Cited by: 127
Authors
Zhang, Tong [1 ]
Affiliation
[1] Rutgers State Univ, Dept Stat, Piscataway, NJ 08854 USA
Keywords
L1 regularization; Lasso; regression; sparsity; variable selection; parameter estimation; STATISTICAL ESTIMATION; DANTZIG SELECTOR; SPARSITY; LARGER; LASSO
DOI
10.1214/08-AOS659
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Subject Classification Codes
020208; 070103; 0714
Abstract
We derive sharp performance bounds for least squares regression with L1 regularization from parameter estimation accuracy and feature selection quality perspectives. The main result proved for L1 regularization extends a similar result in [Ann. Statist. 35 (2007) 2313-2351] for the Dantzig selector. It gives an affirmative answer to an open question in [Ann. Statist. 35 (2007) 2358-2364]. Moreover, the result leads to an extended view of feature selection that allows less restrictive conditions than some recent work. Based on the theoretical insights, a novel two-stage L1-regularization procedure with selective penalization is analyzed. It is shown that if the target parameter vector can be decomposed as the sum of a sparse parameter vector with large coefficients and another less sparse vector with relatively small coefficients, then the two-stage procedure can lead to improved performance.
Pages: 2109-2144
Page count: 36
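
The sketch below illustrates the two-stage selective-penalization idea described in the abstract: a first L1 fit identifies the large coefficients, and a second fit penalizes those coefficients much more lightly than the rest. This is a minimal illustration, not the paper's exact estimator or tuning; it uses scikit-learn's Lasso as a stand-in, emulates per-feature penalty weights through the standard column-rescaling trick for a weighted L1 penalty, and all specific values (alpha1, alpha2, threshold, eps) are hypothetical choices for demonstration only.

```python
# Illustrative two-stage L1 procedure with selective penalization (sketch only).
# Assumption: per-feature penalty weights are emulated by rescaling columns,
# since sklearn's Lasso applies a single global penalty. A weighted penalty
# sum_j w_j |b_j| is equivalent to an ordinary Lasso on X_j / w_j, followed by
# rescaling the fitted coefficients by 1 / w_j.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Synthetic data: a sparse block of large coefficients plus a denser block
# of small coefficients, matching the decomposition discussed in the abstract.
n, p = 200, 500
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = 3.0        # few large coefficients
beta[5:25] = 0.2      # many relatively small coefficients
y = X @ beta + 0.5 * rng.standard_normal(n)

# Stage 1: ordinary Lasso to locate the large coefficients.
alpha1 = 0.1                                   # hypothetical penalty level
stage1 = Lasso(alpha=alpha1, max_iter=10000).fit(X, y)
threshold = 0.5                                # hypothetical "large coefficient" cutoff
selected = np.abs(stage1.coef_) > threshold

# Stage 2: selective penalization. Features selected in stage 1 get a small
# penalty weight eps (nearly unpenalized); all others keep weight 1.
eps = 1e-3                                     # hypothetical relief factor
weights = np.where(selected, eps, 1.0)
X_scaled = X / weights                         # column-wise rescaling implements the weights
alpha2 = 0.1                                   # hypothetical stage-2 penalty level
stage2 = Lasso(alpha=alpha2, max_iter=10000).fit(X_scaled, y)
beta_hat = stage2.coef_ / weights              # undo the rescaling

print("stage 1 support size:", int(selected.sum()))
print("max error on large block:", float(np.abs(beta_hat[:5] - beta[:5]).max()))
```

The design choice here is the usual reweighting argument: minimizing ||y - (X/w)b~||^2 + alpha * ||b~||_1 and setting b = b~ / w recovers the weighted problem ||y - Xb||^2 + alpha * sum_j w_j |b_j|, so lightly weighted (selected) features are effectively exempt from shrinkage in the second stage.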