Orthogonality-projection-based penalized variable selection for high-dimensional partially linear models

Times Cited: 0
Authors
Yang, Yiping [1 ,2 ]
Zhao, Peixin [1 ,2 ]
Zhang, Jun [3 ]
Affiliations
[1] Chongqing Technol & Business Univ, Sch Math & Stat, Chongqing 400067, Peoples R China
[2] Chongqing Technol & Business Univ, Chongqing Key Lab Stat Intelligent Comp & Monitori, Chongqing 400067, Peoples R China
[3] Shenzhen Univ, Sch Math Sci, Shenzhen 518060, Peoples R China
Keywords
Partially linear model; Orthogonality-projection; Variable selection; Diabetes data; Gene expression data; VARYING-COEFFICIENT MODELS; ADAPTIVE ELASTIC-NET; REGRESSION; LIKELIHOOD; LASSO;
DOI
10.1016/j.apm.2024.115785
CLC Number
T [Industrial Technology];
Discipline Code
08;
Abstract
The Smoothly Clipped Absolute Deviation Net (SCAD-Net) variable selection procedure is introduced for high-dimensional partially linear models. To mitigate the influence of the nonparametric component on the selection and estimation of the parametric components, B-spline approximation and an orthogonality-projection technique are employed to construct a SCAD-Net penalized least squares objective function. The proposed procedure not only identifies the relevant variables but also simultaneously estimates their coefficients, and estimators for the nonparametric component are presented as well. Under certain regularity conditions, the selection procedure for the parametric components is shown to possess the grouping effect and the oracle property, ensuring consistent variable selection, while the estimator of the nonparametric component attains the optimal convergence rate for nonparametric estimation. For efficient implementation, an algorithm based on local linear approximation is proposed, with the penalty parameters chosen by K-fold cross-validation. Simulation studies and real-data analyses cover scenarios in which the number of variables p is either smaller than or far larger than the sample size n.
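The pipeline described in the abstract can be sketched in a few steps: approximate the nonparametric component with a B-spline basis, project the response and covariates onto the orthogonal complement of the spline space, and then minimize a SCAD-plus-ridge ("net") penalized least squares criterion via local linear approximation (LLA), which reduces each step to a weighted elastic-net problem. The sketch below is a minimal illustration under assumed settings, not the paper's implementation: the simulated model, knot placement, penalty levels `lam1`/`lam2` (the paper selects these by K-fold cross-validation), and iteration counts are all hypothetical choices.

```python
import numpy as np
from scipy.interpolate import BSpline

rng = np.random.default_rng(0)
n, p, a = 200, 10, 3.7                      # a = 3.7 is the conventional SCAD constant
beta_true = np.zeros(p)
beta_true[:2] = [2.0, -1.5]                 # only the first two covariates are relevant
X = rng.standard_normal((n, p))
u = rng.random(n)
y = X @ beta_true + np.sin(2 * np.pi * u) + 0.2 * rng.standard_normal(n)

# Cubic B-spline design matrix on [0, 1] (4 interior knots, an illustrative choice)
k = 3
t = np.r_[[0.0] * (k + 1), np.linspace(0.2, 0.8, 4), [1.0] * (k + 1)]
B = BSpline.design_matrix(u, t, k).toarray()

# Orthogonality projection onto the complement of the spline space,
# removing the nonparametric component before estimating beta
Q = np.eye(n) - B @ np.linalg.solve(B.T @ B, B.T)
Xt, yt = Q @ X, Q @ y

def scad_deriv(b, lam):
    """Derivative of the SCAD penalty; serves as the LLA weights."""
    b = np.abs(b)
    return lam * ((b <= lam) + np.maximum(a * lam - b, 0) / ((a - 1) * lam) * (b > lam))

def weighted_enet(Xm, ym, w, lam2, iters=300):
    """Coordinate descent for (1/2n)||y - Xb||^2 + sum_j w_j|b_j| + lam2 ||b||^2."""
    nn, pp = Xm.shape
    b, r = np.zeros(pp), ym.copy()
    c = (Xm ** 2).mean(axis=0)
    for _ in range(iters):
        for j in range(pp):
            z = Xm[:, j] @ r / nn + c[j] * b[j]                  # partial residual term
            bj = np.sign(z) * max(abs(z) - w[j], 0) / (c[j] + 2 * lam2)
            r += Xm[:, j] * (b[j] - bj)
            b[j] = bj
    return b

# LLA loop: each step is a weighted elastic net with SCAD-derivative weights
lam1, lam2 = 0.15, 0.01                     # hypothetical penalty levels
b = np.zeros(p)
for _ in range(3):
    b = weighted_enet(Xt, yt, scad_deriv(b, lam1), lam2)

# Plug-in estimate of the nonparametric component from the residuals
g_hat = B @ np.linalg.solve(B.T @ B, B.T @ (y - X @ b))
```

Because the SCAD derivative vanishes for large coefficients, the LLA iterations leave strong signals essentially unpenalized (the oracle behavior claimed in the abstract), while the small ridge term `lam2` supplies the grouping effect among correlated covariates.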
Pages: 21