Sequential Lasso Cum EBIC for Feature Selection With Ultra-High Dimensional Feature Space

Cited by: 49
Authors
Luo, Shan [1 ]
Chen, Zehua [2 ]
Affiliations
[1] Shanghai Jiao Tong Univ, Dept Math, Shanghai 200030, Peoples R China
[2] Natl Univ Singapore, Dept Stat & Appl Probabil, Singapore 117548, Singapore
Keywords
Extended BIC; Oracle property; Selection consistency; Sparse high-dimensional linear models; NONCONCAVE PENALIZED LIKELIHOOD; ORTHOGONAL MATCHING PURSUIT; VARIABLE SELECTION; MODEL SELECTION; SIGNAL RECOVERY; ORACLE PROPERTIES; ADAPTIVE LASSO; LINEAR-MODELS; REGRESSION; SHRINKAGE;
DOI
10.1080/01621459.2013.877275
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Discipline Classification Codes
020208 ; 070103 ; 0714 ;
Abstract
In this article, we propose a method called sequential Lasso (SLasso) for feature selection in sparse high-dimensional linear models. SLasso selects features by sequentially solving partially penalized least squares problems in which the features selected in earlier steps are not penalized. SLasso uses the extended BIC (EBIC) as its stopping rule: the procedure stops when the EBIC reaches a minimum. The asymptotic properties of SLasso are considered when the dimension of the feature space is ultra high and the number of relevant features diverges. We show that, with probability converging to 1, SLasso selects all the relevant features before any irrelevant feature is selected, and that the EBIC decreases until it attains its minimum at the model consisting of exactly the relevant features and then begins to increase. These results establish the selection consistency of SLasso. The SLasso estimators of the final model are ordinary least squares estimators. The selection consistency implies the oracle property of SLasso. The asymptotic distribution of the SLasso estimators with a diverging number of relevant features is provided. SLasso is compared with other methods in simulation studies, which demonstrate that it is a desirable approach with an edge over the other methods. SLasso and the other methods are applied to a microarray dataset for mapping disease genes. Supplementary materials for this article are available online.
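As a rough illustration of the procedure described in the abstract, the sketch below implements a greedy version of the sequential selection with an EBIC stopping rule. This is a minimal sketch, not the authors' implementation: the partially penalized Lasso step is approximated by its limiting case of adding the single feature most correlated with the current partial residual, the penalty constant `gamma` and the names `ebic` and `sequential_lasso_ebic` are illustrative choices, and only NumPy is assumed.

```python
# Illustrative sketch of the SLasso idea: features enter sequentially (features
# selected in earlier steps are left unpenalized; in the limiting case the next
# entrant is the feature most correlated with the current partial residual),
# and the extended BIC (EBIC) is used as the stopping rule.
# Names and the greedy approximation are assumptions, not the paper's code.
import numpy as np
from math import lgamma, log


def ebic(y, X, subset, gamma=0.5):
    """EBIC of the OLS fit on `subset`: n*log(RSS/n) + k*log(n) + 2*gamma*log C(p, k)."""
    n, p = X.shape
    k = len(subset)
    if k == 0:
        rss = float(y @ y)
    else:
        beta, *_ = np.linalg.lstsq(X[:, subset], y, rcond=None)
        rss = float(np.sum((y - X[:, subset] @ beta) ** 2))
    log_binom = lgamma(p + 1) - lgamma(k + 1) - lgamma(p - k + 1)
    return n * log(rss / n) + k * log(n) + 2.0 * gamma * log_binom


def sequential_lasso_ebic(y, X, gamma=0.5, max_steps=None):
    """Greedy sequential selection stopped when the EBIC stops decreasing."""
    n, p = X.shape
    selected = []
    best_ebic = ebic(y, X, selected, gamma)
    max_steps = max_steps or min(n - 1, p)
    for _ in range(max_steps):
        # Partial residual: y projected off the columns already selected.
        if selected:
            beta, *_ = np.linalg.lstsq(X[:, selected], y, rcond=None)
            resid = y - X[:, selected] @ beta
        else:
            resid = y
        # Candidate entrant: the unselected feature with largest |x_j' * residual|.
        candidates = [j for j in range(p) if j not in selected]
        j_star = max(candidates, key=lambda j: abs(X[:, j] @ resid))
        trial = selected + [j_star]
        trial_ebic = ebic(y, X, trial, gamma)
        if trial_ebic >= best_ebic:
            break  # EBIC has reached its minimum: keep the current model.
        selected, best_ebic = trial, trial_ebic
    # Final estimators are ordinary least squares on the selected model.
    if selected:
        beta_hat, *_ = np.linalg.lstsq(X[:, selected], y, rcond=None)
    else:
        beta_hat = np.zeros(0)
    return selected, beta_hat
```

In a simulation along the lines of the paper's setting, one would generate X of shape (n, p) with p much larger than n, a response y driven by a small true support, and compare the returned `selected` set with that support; `gamma` in [0, 1] is the usual EBIC tuning constant.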
Pages: 1229-1240
Page count: 12
Related Papers
50 records in total
  • [31] Feature selection for high-dimensional temporal data
    Tsagris, Michail
    Lagani, Vincenzo
    Tsamardinos, Ioannis
    BMC BIOINFORMATICS, 19
  • [32] Forward Variable Selection for Sparse Ultra-High Dimensional Varying Coefficient Models
    Cheng, Ming-Yen
    Honda, Toshio
    Zhang, Jin-Ting
    JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, 2016, 111 (515) : 1209 - 1221
  • [33] NONPARAMETRIC INDEPENDENCE SCREENING AND STRUCTURE IDENTIFICATION FOR ULTRA-HIGH DIMENSIONAL LONGITUDINAL DATA
    Cheng, Ming-Yen
    Honda, Toshio
    Li, Jialiang
    Peng, Heng
    ANNALS OF STATISTICS, 2014, 42 (05) : 1819 - 1849
  • [34] MODEL SELECTION AND STRUCTURE SPECIFICATION IN ULTRA-HIGH DIMENSIONAL GENERALISED SEMI-VARYING COEFFICIENT MODELS
    Li, Degui
    Ke, Yuan
    Zhang, Wenyang
    ANNALS OF STATISTICS, 2015, 43 (06) : 2676 - 2705
  • [35] A sequential feature selection procedure for high-dimensional Cox proportional hazards model
    Yu, Ke
    Luo, Shan
    ANNALS OF THE INSTITUTE OF STATISTICAL MATHEMATICS, 2022, 74 : 1109 - 1142
  • [36] Feature Selection in High-Dimensional Space with Applications to Gene Expression Data
    Pantha, Nishan
    Ramasubramanian, Muthukumaran
    Gurung, Iksha
    Maskey, Manil
    Sanders, Lauren M.
    Casaletto, James
    Costes, Sylvain V.
    SOUTHEASTCON 2024, 2024, : 6 - 15
  • [37] Feature selection for pattern recognition by LASSO and thresholding methods - a comparison
    Libal, Urszula
    2011 16TH INTERNATIONAL CONFERENCE ON METHODS AND MODELS IN AUTOMATION AND ROBOTICS, 2011, : 168 - 173
  • [38] Feature Selection for Neural Networks Using Group Lasso Regularization
    Zhang, Huaqing
    Wang, Jian
    Sun, Zhanquan
    Zurada, Jacek M.
    Pal, Nikhil R.
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2020, 32 (04) : 659 - 673
  • [39] High-dimensional genomic feature selection with the ordered stereotype logit model
    Seffernick, Anna Eames
    Mrozek, Krzysztof
    Nicolet, Deedra
    Stone, Richard M.
    Eisfeld, Ann-Kathrin
    Byrd, John C.
    Archer, Kellie J.
    BRIEFINGS IN BIOINFORMATICS, 2022, 23 (06)
  • [40] Cluster feature selection in high-dimensional linear models
    Lin, Bingqing
    Pang, Zhen
    Wang, Qihua
    RANDOM MATRICES-THEORY AND APPLICATIONS, 2018, 7 (01)