A gradient descent algorithm for LASSO

Cited by: 0
Authors
Kim, Yongdai [1 ]
Kim, Yuwon [1 ]
Kim, Jinseog [1 ]
Affiliation
[1] Seoul Natl Univ, Dept Stat, Seoul 151742, South Korea
Source
PREDICTION AND DISCOVERY | 2007, Vol. 443
Keywords
gradient; LASSO; optimization;
DOI
Not available
CLC classification
O29 [Applied Mathematics]
Subject classification
070104
Abstract
In this paper, we propose a new optimization algorithm for LASSO, a regression method with an L1 constraint, based on the gradient descent method. An important advantage of the proposed algorithm over existing methods is that it is more stable, particularly when the dimension of the input is large. Here, "more stable" means that the proposed algorithm always yields an optimal solution, whereas existing methods may fail to do so. Simulation results comparing the proposed algorithm with the QP-based algorithm are also given.
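The abstract describes a gradient-based solver for the L1-constrained LASSO. The paper's own algorithm is not reproduced in this record; as a generic illustration of the problem class, below is a minimal sketch of a proximal-gradient (ISTA-style) LASSO solver. The function names, the soft-thresholding operator, and the Lipschitz step-size choice are standard conventions assumed here, not details taken from the paper.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of the L1 norm: shrink each entry toward zero by t."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_ista(X, y, lam, n_iter=500):
    """Minimize (1/2n)||y - X beta||^2 + lam * ||beta||_1 by proximal gradient.

    A generic ISTA-style sketch, not the algorithm of Kim et al. (2007).
    """
    n, p = X.shape
    beta = np.zeros(p)
    # Step size 1/L, where L is the Lipschitz constant of the smooth part's gradient.
    L = np.linalg.norm(X, 2) ** 2 / n
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y) / n   # gradient of the least-squares term
        beta = soft_threshold(beta - grad / L, lam / L)
    return beta
```

A sufficiently large penalty `lam` drives every coefficient exactly to zero, which is the sparsity behavior the L1 constraint is used for.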
Pages: 73-82 (10 pages)