A General Theory of Concave Regularization for High-Dimensional Sparse Estimation Problems

Cited by: 209
Authors
Zhang, Cun-Hui [1 ]
Zhang, Tong [1 ]
Affiliations
[1] Rutgers State Univ, Hill Ctr, Dept Stat & Biostat, Busch Campus, Piscataway, NJ 08854 USA
Funding
US National Science Foundation;
Keywords
Concave regularization; sparse recovery; global solution; local solution; approximate solution; oracle inequality; variable selection; NONCONCAVE PENALIZED LIKELIHOOD; VARIABLE SELECTION; DANTZIG SELECTOR; MODEL SELECTION; ADAPTIVE LASSO; REGRESSION; RECOVERY; L(1)-PENALIZATION; SIGNALS;
DOI
10.1214/12-STS399
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Discipline Classification Codes
020208 ; 070103 ; 0714 ;
Abstract
Concave regularization methods provide natural procedures for sparse recovery, but they are difficult to analyze in the high-dimensional setting. Only recently have a few sparse recovery results been established for certain local solutions obtained via specialized numerical procedures. Still, the fundamental relationships among these solutions remain unknown: for example, whether they are identical, and how they relate to the global minimizer of the underlying nonconvex formulation. The current paper fills this conceptual gap by presenting a general theoretical framework showing that, under appropriate conditions, the global solution of nonconvex regularization achieves desirable recovery performance; moreover, under suitable conditions, the global solution coincides with the unique sparse local solution, which can be obtained via different numerical procedures. Within this unified framework, we present an overview of existing results and discuss their connections. The unified view of this work leads to a more satisfactory treatment of concave high-dimensional sparse estimation procedures, and serves as a guideline for developing further numerical procedures for concave regularization.
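To make the contrast between convex and concave regularization concrete, the sketch below (illustrative only; the function names and the choice of the minimax concave penalty, MCP, are assumptions, not taken from this paper) compares the scalar thresholding operators induced by the lasso penalty and by a concave penalty in a univariate least-squares problem. The concave rule shrinks small inputs like the lasso but leaves large coefficients unbiased, which is one motivation for concave regularization in sparse estimation.

```python
import numpy as np

def soft_threshold(z, lam):
    """Lasso (l1) scalar thresholding operator: shrinks every input toward zero by lam."""
    return np.sign(z) * max(abs(z) - lam, 0.0)

def mcp_threshold(z, lam, gamma=3.0):
    """Univariate solution under the minimax concave penalty (MCP), gamma > 1.
    Behaves like a rescaled soft threshold near zero, but returns z unchanged
    (no shrinkage bias) once |z| exceeds gamma * lam."""
    if abs(z) <= gamma * lam:
        return gamma / (gamma - 1.0) * soft_threshold(z, lam)
    return z

# With lam = 1, gamma = 3: small inputs are set to zero, moderate inputs are
# partially shrunk, and large inputs pass through unshrunk, unlike the lasso.
print(mcp_threshold(0.5, 1.0))   # killed to 0.0
print(mcp_threshold(2.0, 1.0))   # partially shrunk
print(mcp_threshold(5.0, 1.0))   # unbiased: returns 5.0
print(soft_threshold(5.0, 1.0))  # lasso still shrinks: 4.0
```

Scalar rules of this kind are the building blocks of coordinate-descent algorithms for concave penalized regression, which produce the sparse local solutions whose relationship to the global minimizer the paper analyzes.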
Pages: 576-593 (18 pages)