Some sharp performance bounds for least squares regression with L1 regularization

Cited: 126
Author
Zhang, Tong [1]
Affiliation
[1] Rutgers State Univ, Dept Stat, Piscataway, NJ 08854 USA
Source
ANNALS OF STATISTICS | 2009, Vol. 37, No. 5A
Keywords
L-1 regularization; Lasso; regression; sparsity; variable selection; parameter estimation; STATISTICAL ESTIMATION; DANTZIG SELECTOR; SPARSITY; LARGER; LASSO;
DOI
10.1214/08-AOS659
CLC number
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Subject classification number
020208 ; 070103 ; 0714 ;
Abstract
We derive sharp performance bounds for least squares regression with L-1 regularization from the parameter estimation accuracy and feature selection quality perspectives. The main result proved for L-1 regularization extends a similar result in [Ann. Statist. 35 (2007) 2313-2351] for the Dantzig selector. It gives an affirmative answer to an open question in [Ann. Statist. 35 (2007) 2358-2364]. Moreover, the result leads to an extended view of feature selection that allows less restrictive conditions than some recent work. Based on the theoretical insights, a novel two-stage L-1-regularization procedure with selective penalization is analyzed. It is shown that if the target parameter vector can be decomposed as the sum of a sparse parameter vector with large coefficients and another less sparse vector with relatively small coefficients, then the two-stage procedure can lead to improved performance.
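The two-stage idea in the abstract can be illustrated with a minimal sketch: fit an ordinary Lasso, identify the large coefficients, then refit with the penalty reduced on the selected features ("selective penalization"). This is an illustrative approximation only, not the paper's exact algorithm; the threshold, penalty level, and the column-rescaling trick for a weighted Lasso are all assumptions of this sketch.

```python
# Hedged sketch of a two-stage L1 procedure with selective penalization,
# in the spirit of the abstract (not the paper's exact method).
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 200, 50
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = 3.0             # sparse part: a few large coefficients
beta[5:15] = 0.2           # less sparse part: many small coefficients
y = X @ beta + 0.5 * rng.standard_normal(n)

# Stage 1: ordinary Lasso to locate the large coefficients.
stage1 = Lasso(alpha=0.1).fit(X, y)
selected = np.abs(stage1.coef_) > 0.5   # threshold is an assumption

# Stage 2: selective penalization -- reduce the effective penalty on the
# selected features by rescaling their columns (a weighted-Lasso trick).
weights = np.where(selected, 10.0, 1.0)  # larger scale => smaller effective penalty
stage2 = Lasso(alpha=0.1).fit(X * weights, y)
beta_hat = stage2.coef_ * weights        # map back to the original scale

print(beta_hat[:5])   # large coefficients, now with much less shrinkage bias
```

The second stage debiases the large coefficients that the plain Lasso shrinks toward zero, while keeping the full penalty on the remaining features.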
Pages: 2109-2144
Page count: 36
Related Papers
50 records total
  • [1] A Sharp Nonasymptotic Bound and Phase Diagram of L1/2 Regularization
    Zhang, Hai
    Xu, Zong Ben
    Wang, Yao
    Chang, Xiang Yu
    Liang, Yong
    ACTA MATHEMATICA SINICA-ENGLISH SERIES, 2014, 30 (07) : 1242 - 1258
  • [2] A Survey of L1 Regression
    Vidaurre, Diego
    Bielza, Concha
    Larranaga, Pedro
    INTERNATIONAL STATISTICAL REVIEW, 2013, 81 (03) : 361 - 387
  • [3] L1 least squares for sparse high-dimensional LDA
    Li, Yanfang
    Jia, Jinzhu
    ELECTRONIC JOURNAL OF STATISTICS, 2017, 11 (01) : 2499 - 2518
  • [4] Combined l1 and Greedy l0 Penalized Least Squares for Linear Model Selection
    Pokarowski, Piotr
    Mielniczuk, Jan
    JOURNAL OF MACHINE LEARNING RESEARCH, 2015, 16 : 961 - 992
  • [5] Local regularization assisted orthogonal least squares regression
    Chen, S
    NEUROCOMPUTING, 2006, 69 (4-6) : 559 - 585
  • [6] Gene Selection in Cancer Classification Using Sparse Logistic Regression with L1/2 Regularization
    Wu, Shengbing
    Jiang, Hongkun
    Shen, Haiwei
    Yang, Ziyi
    APPLIED SCIENCES-BASEL, 2018, 8 (09)
  • [7] Least angle and l(1) penalized regression: A review
    Hesterberg, Tim
    Choi, Nam Hee
    Meier, Lukas
    Fraley, Chris
    STATISTICS SURVEYS, 2008, 2 : 61 - 93
  • [8] The L1/2 regularization method for variable selection in the Cox model
    Liu, Cheng
    Liang, Yong
    Luan, Xin-Ze
    Leung, Kwong-Sak
    Chan, Tak-Ming
    Xu, Zong-Ben
    Zhang, Hai
    APPLIED SOFT COMPUTING, 2014, 14 : 498 - 503
  • [9] Parameter choices for sparse regularization with the l1 norm
    Liu, Qianru
    Wang, Rui
    Xu, Yuesheng
    Yan, Mingsong
    INVERSE PROBLEMS, 2023, 39 (02)
  • [10] Generalization of l1 constraints for high dimensional regression problems
    Alquier, Pierre
    Hebiri, Mohamed
    STATISTICS & PROBABILITY LETTERS, 2011, 81 (12) : 1760 - 1765