Group subset selection for linear regression

Cited by: 13
Authors
Guo, Yi [1 ]
Berman, Mark [1 ]
Gao, Junbin [2 ]
Affiliations
[1] CSIRO Computational Informatics, North Ryde, NSW 1670, Australia
[2] Charles Sturt University, School of Computing & Mathematics, Bathurst, NSW 2795, Australia
Keywords
Subset selection; Group Lasso; Linear regression; Screening; Variable selection; Algorithms; Regularization; Shrinkage; Lasso; Model
DOI
10.1016/j.csda.2014.02.005
Chinese Library Classification (CLC)
TP39 [Computer applications]
Subject classification codes
081203; 0835
Abstract
Two fast group subset selection (GSS) algorithms for the linear regression model are proposed in this paper. GSS finds the best combinations of groups up to a specified size, minimising the residual sum of squares. This imposes an ℓ0 constraint on the regression coefficients in a group context. It is a combinatorial optimisation problem with NP complexity. To make the exhaustive search very efficient, the GSS algorithms are built on QR decomposition and branch-and-bound techniques. They are suitable for middle-scale problems where finding the most accurate solution is essential. In the application motivating this research, it is natural to require that the coefficients of some of the variables within groups satisfy some constraints (e.g. non-negativity). Therefore the GSS algorithms (optionally) calculate the model coefficient estimates during the exhaustive search in order to screen combinations that do not meet the constraints. The faster of the two GSS algorithms is compared to an extension of the original group Lasso, called the constrained group Lasso (CGL), which is proposed to handle convex constraints and to remove orthogonality requirements on the variables within each group. CGL is a convex relaxation of the GSS problem and hence more straightforward to solve. Although CGL is inferior to GSS in terms of group selection accuracy, it is a fast approximation to GSS if the optimal regularisation parameter can be determined efficiently and, in some cases, it may serve as a screening procedure to reduce the number of groups. (C) 2014 Elsevier B.V. All rights reserved.
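To make the combinatorial problem described in the abstract concrete, the sketch below enumerates every combination of up to a given number of groups and keeps the one with the smallest residual sum of squares. This is an illustrative brute-force baseline only, not the paper's method: the function and variable names are assumptions, and the actual GSS algorithms make this search practical through QR updates and branch-and-bound pruning, optionally screening out combinations whose coefficients violate constraints such as non-negativity.

```python
import itertools
import numpy as np

def group_subset_selection(X, y, groups, max_groups):
    """Naive exhaustive group subset selection (illustrative sketch).

    X          : (n, p) design matrix
    y          : (n,)   response vector
    groups     : list of integer index arrays, one per group of columns of X
    max_groups : largest number of groups allowed in the model

    Returns (RSS, selected group indices, coefficients) for the combination
    of groups with the smallest residual sum of squares.
    """
    best = (np.inf, None, None)
    for size in range(1, max_groups + 1):
        for combo in itertools.combinations(range(len(groups)), size):
            cols = np.concatenate([groups[g] for g in combo])
            Xs = X[:, cols]
            # Ordinary least squares on the selected columns; the paper's
            # algorithms avoid refitting from scratch by updating a QR
            # decomposition and prune the search with branch-and-bound.
            coef, *_ = np.linalg.lstsq(Xs, y, rcond=None)
            rss = np.sum((y - Xs @ coef) ** 2)
            if rss < best[0]:
                best = (rss, combo, coef)
    return best

# Hypothetical example: 6 groups of 2 variables each, select the best 2 groups.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 12))
groups = [np.arange(2 * g, 2 * g + 2) for g in range(6)]
y = (X[:, groups[1]] @ np.array([1.5, -2.0])
     + X[:, groups[4]] @ np.array([0.7, 0.3])
     + 0.1 * rng.normal(size=50))
rss, combo, coef = group_subset_selection(X, y, groups, max_groups=2)
print(combo, rss)
```

The cost of this naive search grows combinatorially with the number of groups, which is exactly why the paper's QR-based branch-and-bound formulation, or the CGL convex relaxation, is needed for anything beyond small problems.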
Pages: 39-52
Number of pages: 14