Some sharp performance bounds for least squares regression with L1 regularization

Times Cited: 126
Authors
Zhang, Tong [1 ]
Affiliation
[1] Rutgers State Univ, Dept Stat, Piscataway, NJ 08854 USA
Keywords
L1 regularization; Lasso; regression; sparsity; variable selection; parameter estimation; STATISTICAL ESTIMATION; DANTZIG SELECTOR; SPARSITY; LARGER; LASSO;
DOI
10.1214/08-AOS659
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Subject Classification Codes
020208; 070103; 0714;
Abstract
We derive sharp performance bounds for least squares regression with L1 regularization from parameter estimation accuracy and feature selection quality perspectives. The main result proved for L1 regularization extends a similar result in [Ann. Statist. 35 (2007) 2313-2351] for the Dantzig selector. It gives an affirmative answer to an open question in [Ann. Statist. 35 (2007) 2358-2364]. Moreover, the result leads to an extended view of feature selection that allows less restrictive conditions than some recent work. Based on the theoretical insights, a novel two-stage L1-regularization procedure with selective penalization is analyzed. It is shown that if the target parameter vector can be decomposed as the sum of a sparse parameter vector with large coefficients and another less sparse vector with relatively small coefficients, then the two-stage procedure can lead to improved performance.
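The abstract describes a two-stage procedure: a first-stage Lasso fit, followed by a second stage in which the coordinates selected in stage one are left unpenalized. The sketch below is only an illustration of that idea, not the paper's exact algorithm or constants; the penalty level, the selection threshold, and the plain ISTA solver are all assumptions made here for demonstration.

```python
# Minimal sketch of two-stage L1-regularized least squares with
# selective penalization (illustrative assumptions throughout).
import numpy as np


def lasso_ista(X, y, lam, penalized, n_iter=2000):
    """Minimize 0.5*||y - X b||^2 + lam * sum_{j in penalized} |b_j|
    by proximal gradient (ISTA); unpenalized coordinates get a plain
    gradient step (zero penalty, so their prox is the identity)."""
    n, p = X.shape
    b = np.zeros(p)
    step = 1.0 / np.linalg.norm(X, 2) ** 2   # 1 / Lipschitz constant of the gradient
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y)
        b = b - step * grad
        # soft-threshold only the penalized coordinates
        b[penalized] = np.sign(b[penalized]) * np.maximum(
            np.abs(b[penalized]) - step * lam, 0.0)
    return b


rng = np.random.default_rng(0)
n, p = 100, 200
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = 5.0                  # a few large coefficients
beta[5:15] = 0.3                # a less sparse set of small coefficients
y = X @ beta + 0.5 * rng.standard_normal(n)

lam = 0.5 * np.sqrt(2.0 * n * np.log(p))   # illustrative penalty level

# Stage 1: ordinary Lasso (every coordinate penalized).
b1 = lasso_ista(X, y, lam, penalized=np.arange(p))

# Keep coordinates with large stage-1 estimates (threshold is a guess).
selected = np.abs(b1) > 1.0

# Stage 2: selective penalization, i.e. penalize only unselected coordinates.
b2 = lasso_ista(X, y, lam, penalized=np.where(~selected)[0])
```

The point of the second stage is that the large, reliably selected coefficients are no longer shrunk by the L1 penalty, while the remaining coordinates still are.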
Pages: 2109-2144
Number of pages: 36
Related Papers
50 records in total
[41]   L1/2 regularization [J].
Xu, ZongBen ;
Zhang, Hai ;
Wang, Yao ;
Chang, XiangYu ;
Liang, Yong .
SCIENCE CHINA-INFORMATION SCIENCES, 2010, 53 (06) :1159-1169
[42]   Truncated L1 Regularized Linear Regression: Theory and Algorithm [J].
Dai, Mingwei ;
Dai, Shuyang ;
Huang, Junjun ;
Kang, Lican ;
Lu, Xiliang .
COMMUNICATIONS IN COMPUTATIONAL PHYSICS, 2021, 30 (01) :190-209
[43]   Adaptive L1/2 Shooting Regularization Method for Survival Analysis Using Gene Expression Data [J].
Liu, Xiao-Ying ;
Liang, Yong ;
Xu, Zong-Ben ;
Zhang, Hai ;
Leung, Kwong-Sak .
SCIENTIFIC WORLD JOURNAL, 2013,
[44]   Variable selection in convex quantile regression: L1-norm or L0-norm regularization? [J].
Dai, Sheng .
EUROPEAN JOURNAL OF OPERATIONAL RESEARCH, 2023, 305 (01) :338-355
[45]   Model-averaged l1 regularization using Markov chain Monte Carlo model composition [J].
Fraley, Chris ;
Percival, Daniel .
JOURNAL OF STATISTICAL COMPUTATION AND SIMULATION, 2015, 85 (06) :1090-1101
[46]   Application of L1/2 regularization logistic method in heart disease diagnosis [J].
Zhang, Bowen ;
Chai, Hua ;
Yang, Ziyi ;
Liang, Yong ;
Chu, Gejin ;
Liu, Xiaoying .
BIO-MEDICAL MATERIALS AND ENGINEERING, 2014, 24 (06) :3447-3454
[47]   Structural damage identification based on substructure sensitivity and l1 sparse regularization [J].
Zhou, Shumei ;
Bao, Yuequan ;
Li, Hui .
SENSORS AND SMART STRUCTURES TECHNOLOGIES FOR CIVIL, MECHANICAL, AND AEROSPACE SYSTEMS 2013, 2013, 8692
[48]   RECURRENT NEURAL NETWORK WITH L1/2 REGULARIZATION FOR REGRESSION AND MULTICLASS CLASSIFICATION PROBLEMS [J].
Li, Lin ;
Fan, Qinwei ;
Zhou, Li .
JOURNAL OF NONLINEAR FUNCTIONAL ANALYSIS, 2022, 2022
[49]   3-D Image-Domain Least-Squares Reverse Time Migration With L1 Norm Constraint and Total Variation Regularization [J].
Zhang, Wei ;
Gao, Jinghuai ;
Cheng, Yuanfeng ;
Su, Chaoguang ;
Liang, Hongxian ;
Zhu, Jianbing .
IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2022, 60
[50]   Bayesian relative composite quantile regression approach of ordinal latent regression model with L1/2 regularization [J].
Yu-Zhu, Tian ;
Chun-Ho, Wu ;
Ling-Nan, Tai ;
Zhi-Bao, Mian ;
Mao-Zai, Tian .
STATISTICAL ANALYSIS AND DATA MINING, 2024, 17 (02)