Sequential Lasso Cum EBIC for Feature Selection With Ultra-High Dimensional Feature Space

Cited by: 49
Authors
Luo, Shan [1 ]
Chen, Zehua [2 ]
Affiliations
[1] Shanghai Jiao Tong Univ, Dept Math, Shanghai 200030, Peoples R China
[2] Natl Univ Singapore, Dept Stat & Appl Probabil, Singapore 117548, Singapore
Keywords
Extended BIC; Oracle property; Selection consistency; Sparse high-dimensional linear models; NONCONCAVE PENALIZED LIKELIHOOD; ORTHOGONAL MATCHING PURSUIT; VARIABLE SELECTION; MODEL SELECTION; SIGNAL RECOVERY; ORACLE PROPERTIES; ADAPTIVE LASSO; LINEAR-MODELS; REGRESSION; SHRINKAGE;
DOI
10.1080/01621459.2013.877275
CLC Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Discipline Codes
020208; 070103; 0714
Abstract
In this article, we propose a method called sequential Lasso (SLasso) for feature selection in sparse high-dimensional linear models. SLasso selects features by sequentially solving partially penalized least squares problems in which the features selected in earlier steps are not penalized. SLasso uses the extended BIC (EBIC) as its stopping rule: the procedure stops when EBIC reaches a minimum. The asymptotic properties of SLasso are considered when the dimension of the feature space is ultra high and the number of relevant features diverges. We show that, with probability converging to 1, SLasso selects all the relevant features before any irrelevant feature is selected, and that EBIC decreases until it attains its minimum at the model consisting of exactly the relevant features and then begins to increase. These results establish the selection consistency of SLasso. The SLasso estimators of the final model are ordinary least squares estimators. The selection consistency implies the oracle property of SLasso. The asymptotic distribution of the SLasso estimators with a diverging number of relevant features is provided. SLasso is compared with other methods in simulation studies, which demonstrate that it has an edge over the competing methods. SLasso and the other methods are also applied to microarray data for mapping disease genes. Supplementary materials for this article are available online.
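To make the sequential selection loop and the EBIC stopping rule concrete, here is a minimal numpy sketch. It stands in for the partially penalized lasso step with a greedy residual-correlation step (a simplification: the paper's step penalizes only the not-yet-selected features, and entry by maximal correlation with the current residual is the single-feature special case). The names `slasso_ebic`, `ebic`, and the parameter `gamma` are illustrative, not from the paper; the criterion coded is the standard EBIC_gamma for linear regression, and the columns of X are assumed standardized.

```python
import numpy as np
from math import lgamma, log

def ebic(rss, n, k, p, gamma=0.5):
    """EBIC_gamma = n*log(RSS/n) + k*log(n) + 2*gamma*log(binom(p, k))."""
    log_binom = lgamma(p + 1) - lgamma(k + 1) - lgamma(p - k + 1)
    return n * log(max(rss, 1e-12) / n) + k * log(n) + 2.0 * gamma * log_binom

def slasso_ebic(X, y, gamma=0.5, max_steps=None):
    """Greedy stand-in for sequential Lasso: add the feature most
    correlated with the current residual; stop when EBIC turns upward."""
    n, p = X.shape
    max_steps = max_steps if max_steps is not None else min(n - 1, p)
    selected, best = [], np.inf
    for _ in range(max_steps):
        # Residual of y after OLS on the features selected so far;
        # earlier selections are "unpenalized" in that they stay in the model.
        if selected:
            Xs = X[:, selected]
            coef, *_ = np.linalg.lstsq(Xs, y, rcond=None)
            r = y - Xs @ coef
        else:
            r = y
        score = np.abs(X.T @ r)              # correlation with residual
        if selected:
            score[selected] = -np.inf        # never re-select a feature
        j = int(np.argmax(score))
        cand = selected + [j]
        Xc = X[:, cand]
        coef, *_ = np.linalg.lstsq(Xc, y, rcond=None)
        rss = float(np.sum((y - Xc @ coef) ** 2))
        crit = ebic(rss, n, len(cand), p, gamma)
        if crit >= best:                     # EBIC has reached its minimum
            break
        best, selected = crit, cand
    return selected, best

if __name__ == "__main__":
    # Small p >> n example: 3 relevant features out of 1000.
    rng = np.random.default_rng(0)
    n, p = 100, 1000
    X = rng.standard_normal((n, p))
    y = X[:, :3] @ np.array([2.0, -1.5, 1.0]) + rng.standard_normal(n)
    print(slasso_ebic(X, y))  # typically recovers features [0, 1, 2]
```

Note the stopping logic mirrors the result stated in the abstract: EBIC is expected to decrease while relevant features enter and to turn upward once the true model is reached, so the loop exits at the first increase.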
Pages: 1229 - 1240
Page count: 12
Related Papers
50 records in total
  • [21] HDSI: High dimensional selection with interactions algorithm on feature selection and testing
    Jain, Rahi
    Xu, Wei
    PLOS ONE, 2021, 16 (02)
  • [22] Extended BIC for linear regression models with diverging number of relevant features and high or ultra-high feature spaces
    Luo, Shan
    Chen, Zehua
    JOURNAL OF STATISTICAL PLANNING AND INFERENCE, 2013, 143 (03) : 494 - 504
  • [23] Semiparametric Bayesian information criterion for model selection in ultra-high dimensional additive models
    Lian, Heng
    JOURNAL OF MULTIVARIATE ANALYSIS, 2014, 123 : 304 - 310
  • [24] Combination of Ensembles of Regularized Regression Models with Resampling-Based Lasso Feature Selection in High Dimensional Data
    Patil, Abhijeet R.
    Kim, Sangjin
    MATHEMATICS, 2020, 8 (01)
  • [25] Enumerate Lasso Solutions for Feature Selection
    Hara, Satoshi
    Maehara, Takanori
    THIRTY-FIRST AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2017: 1985 - 1991
  • [26] LASSO-type variable selection methods for high-dimensional data
    Fu, Guanghui
    Wang, Pan
    ADVANCES IN COMPUTATIONAL MODELING AND SIMULATION, PTS 1 AND 2, 2014, 444-445 : 604 - 609
  • [27] High-order covariate interacted Lasso for feature selection
    Zhang, Zhihong
    Tian, Yiyang
    Bai, Lu
    Xiahou, Jianbing
    Hancock, Edwin
    PATTERN RECOGNITION LETTERS, 2017, 87 : 139 - 146
  • [28] On Supervised Feature Selection from High Dimensional Feature Spaces
    Yang, Yijing
    Wang, Wei
    Fu, Hongyu
    Kuo, C.-C. Jay
    APSIPA TRANSACTIONS ON SIGNAL AND INFORMATION PROCESSING, 2022, 11 (01)
  • [29] Feature selection in finite mixture of sparse normal linear models in high-dimensional feature space
    Khalili, Abbas
    Chen, Jiahua
    Lin, Shili
    BIOSTATISTICS, 2011, 12 (01) : 156 - 172
  • [30] High-dimensional sign-constrained feature selection and grouping
    Qin, Shanshan
    Ding, Hao
    Wu, Yuehua
    Liu, Feng
    ANNALS OF THE INSTITUTE OF STATISTICAL MATHEMATICS, 2021, 73 (04) : 787 - 819