Group sparse recovery via group square-root elastic net and the iterative multivariate thresholding-based algorithm

Cited by: 1
Authors
Xie, Wanling [1 ,2 ]
Yang, Hu [3 ]
Affiliations
[1] Hunan Univ Technol & Business, Sch Math & Stat, Changsha 410205, Hunan, Peoples R China
[2] Hunan Univ Technol & Business, Key Lab Hunan Prov Stat Learning & Intelligent Co, Changsha 410205, Hunan, Peoples R China
[3] Chongqing Univ, Coll Math & Stat, Chongqing 401331, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Collinearity; Group square-root elastic net; Group sparsity; Noise level; Oracle inequality; MODEL SELECTION CONSISTENCY; GENERALIZED LINEAR-MODELS; VARIABLE SELECTION; ORACLE INEQUALITIES; LASSO; REGRESSION; REGULARIZATION;
DOI
10.1007/s10182-022-00443-x
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline Codes
020208 ; 070103 ; 0714 ;
Abstract
In this work, we propose a novel group selection method called the Group Square-Root Elastic Net. It is based on square-root regularization with a group elastic net penalty, i.e., an ℓ2,1 + ℓ2 penalty. As a square-root-based procedure, one distinct feature is that the estimator does not depend on the unknown noise level σ, which is non-trivial to estimate in the high-dimensional setting, especially when p ≫ n. In many applications, the estimator is expected to be sparse not in an arbitrary way but in a structured manner, which makes the proposed method attractive for tackling both high dimensionality and structured sparsity. We study correct subset recovery under a Group Elastic Net Irrepresentable Condition. Both slow-rate and fast-rate bounds are established, the latter under the Restricted Eigenvalue assumption and a Gaussian noise assumption. For implementation, a fast algorithm based on the scaled multivariate thresholding-based iterative selection idea is introduced, and its convergence is proved. A comparative study demonstrates the advantages of our approach over alternative methods.
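To make the construction concrete, a plausible form of the objective combines the pivotal square-root loss of the square-root lasso with the two penalty terms named above. The display below is only a sketch under standard group notation (response y in R^n, design X with G non-overlapping groups and coefficient blocks β_g); the exact weighting of the ridge term (squared, as in the classical elastic net, or unsquared to match the ℓ2 notation above) and the choice of the tuning parameters λ1, λ2 are specified in the paper itself:

\hat{\beta} \in \operatorname*{arg\,min}_{\beta \in \mathbb{R}^p}
  \frac{\lVert y - X\beta \rVert_2}{\sqrt{n}}
  + \lambda_1 \sum_{g=1}^{G} \lVert \beta_g \rVert_2
  + \lambda_2 \lVert \beta \rVert_2^2

The iterative algorithm builds on block-wise (multivariate) soft-thresholding. The Python snippet below is a minimal sketch of that generic building block only; the scaled update rule, step sizes, and convergence guarantees proved in the paper are not reproduced here.

import numpy as np

def group_soft_threshold(v, tau):
    # Block-wise soft-thresholding: the proximal operator of tau * ||.||_2.
    # The whole block v is shrunk toward zero and set exactly to zero when
    # its Euclidean norm is at most tau, which is what produces
    # group-structured sparsity.
    norm = np.linalg.norm(v)
    if norm <= tau:
        return np.zeros_like(v)
    return (1.0 - tau / norm) * v

# Example: one coefficient block of a gradient-style update is either shrunk or zeroed out.
print(group_soft_threshold(np.array([0.8, -0.3, 0.1]), tau=0.5))  # shrunk block
print(group_soft_threshold(np.array([0.1, 0.05]), tau=0.5))       # zeroed block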
Pages: 469-507
Number of pages: 39