A General Theory of Concave Regularization for High-Dimensional Sparse Estimation Problems

Cited by: 209
Authors
Zhang, Cun-Hui [1]
Zhang, Tong [1]
Affiliations
[1] Rutgers State Univ, Hill Ctr, Dept Stat & Biostat, Busch Campus, Piscataway, NJ 08854 USA
Funding
National Science Foundation (USA)
Keywords
Concave regularization; sparse recovery; global solution; local solution; approximate solution; oracle inequality; variable selection; NONCONCAVE PENALIZED LIKELIHOOD; VARIABLE SELECTION; DANTZIG SELECTOR; MODEL SELECTION; ADAPTIVE LASSO; REGRESSION; RECOVERY; L(1)-PENALIZATION; SIGNALS;
DOI
10.1214/12-STS399
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline classification codes
020208; 070103; 0714
Abstract
Concave regularization methods provide natural procedures for sparse recovery, but they are difficult to analyze in the high-dimensional setting. Only recently have a few sparse recovery results been established, and only for certain local solutions obtained via specialized numerical procedures. Still, the fundamental relationships among these solutions remain unknown: it is unclear whether they are identical, and how they relate to the global minimizer of the underlying nonconvex formulation. The present paper fills this conceptual gap with a general theoretical framework showing that, under appropriate conditions, the global solution of nonconvex regularization achieves desirable recovery performance; moreover, under suitable conditions, the global solution coincides with the unique sparse local solution, which can be obtained via different numerical procedures. Within this unified framework, we survey existing results and discuss their connections. The unified view leads to a more satisfactory treatment of concave high-dimensional sparse estimation procedures and serves as a guideline for developing further numerical procedures for concave regularization.
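To make the setting concrete, the following is a minimal sketch of concave-penalized least squares using one such penalty, the minimax concave penalty (MCP), fitted by cyclic coordinate descent (the approach of the SparseNet line of work). It is illustrative only: the penalty parameters `lam`, `gamma`, and the synthetic data are assumptions, not values from the paper, and coordinate descent here yields a local solution of the nonconvex objective.

```python
# Minimal sketch of concave (MCP) penalized least squares via cyclic
# coordinate descent. Illustrative assumptions: lam, gamma, and the
# synthetic data below are not taken from the paper.
import numpy as np

def mcp_threshold(z, lam, gamma):
    """Exact minimizer of 0.5*(t - z)**2 + MCP(t; lam, gamma), for gamma > 1."""
    if abs(z) <= lam:
        return 0.0                          # small signals thresholded to zero
    if abs(z) <= gamma * lam:
        # soft-threshold, then undo part of the shrinkage (concave region)
        return np.sign(z) * (abs(z) - lam) / (1.0 - 1.0 / gamma)
    return z                                # large signals left unpenalized

def mcp_coordinate_descent(X, y, lam, gamma=3.0, n_iter=200):
    """Cyclic coordinate descent for 0.5/n*||y - X b||^2 + sum_j MCP(b_j).

    Assumes columns of X are standardized so that x_j'x_j / n == 1.
    Returns a (local) solution of the nonconvex objective.
    """
    n, p = X.shape
    beta = np.zeros(p)
    r = y.copy()                            # residual y - X @ beta
    for _ in range(n_iter):
        for j in range(p):
            zj = X[:, j] @ r / n + beta[j]  # partial-residual correlation
            bj = mcp_threshold(zj, lam, gamma)
            if bj != beta[j]:
                r -= (bj - beta[j]) * X[:, j]
                beta[j] = bj
    return beta

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, p = 200, 5
    X = rng.standard_normal((n, p))
    X /= np.sqrt((X ** 2).mean(axis=0))     # enforce x_j'x_j / n == 1
    beta_true = np.array([3.0, 0.0, 2.0, 0.0, 0.0])
    y = X @ beta_true + 0.1 * rng.standard_normal(n)
    beta_hat = mcp_coordinate_descent(X, y, lam=0.3)
    print(np.flatnonzero(beta_hat))         # recovered support
```

Note the design rationale: for `|z| > gamma*lam` the MCP applies no shrinkage at all, which is what lets concave penalties avoid the bias that the lasso incurs on large coefficients while still zeroing out small ones.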
Pages: 576-593
Page count: 18