Bi-selection in the high-dimensional additive hazards regression model

Cited by: 2
Authors
Liu, Li [1 ]
Su, Wen [2 ]
Zhao, Xingqiu [3 ]
Affiliations
[1] Wuhan Univ, Sch Math & Stat, Wuhan, Hubei, Peoples R China
[2] Univ Hong Kong, Dept Stat & Actuarial Sci, Hong Kong, Peoples R China
[3] Hong Kong Polytech Univ, Dept Appl Math, Hong Kong, Peoples R China
Keywords
Additive hazards model; high dimension; composite penalty; local coordinate descent algorithm; oracle property; NONCONCAVE PENALIZED LIKELIHOOD; VARIABLE SELECTION; ADAPTIVE LASSO;
DOI
10.1214/21-EJS1799
CLC Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Subject Classification
020208; 070103; 0714
Abstract
In this article, we consider a class of regularized regression under the additive hazards model with censored survival data and propose a novel approach that achieves simultaneous group selection, variable selection, and parameter estimation for high-dimensional censored data by combining a composite penalty with the pseudo-score. We develop a local coordinate descent (LCD) algorithm for efficient computation and establish the theoretical properties of the proposed selection methods. The resulting selectors possess both the group selection and the variable selection oracle properties, and thus can simultaneously identify important groups and important variables within selected groups with high probability. Simulation studies demonstrate that the proposed method and the LCD algorithm perform well, and a real data example is provided for illustration.
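The coordinate-descent idea behind the LCD algorithm can be illustrated on the quadratic form that the additive hazards pseudo-score induces. The sketch below is ours, not the paper's: it uses a plain lasso penalty in place of the composite penalty, and the names `V` (quadratic term), `d` (linear term), and `lcd_lasso` are illustrative assumptions.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator S(z, t) = sign(z) * max(|z| - t, 0)."""
    return np.sign(z) * max(abs(z) - t, 0.0)

def lcd_lasso(V, d, lam, n_iter=200, tol=1e-8):
    """Cyclic coordinate descent for the lasso-penalized quadratic objective
        0.5 * b' V b - d' b + lam * ||b||_1,
    a simplified stand-in for the pseudo-score loss with a composite penalty.
    V must have a positive diagonal (e.g. a pseudo-score information matrix)."""
    p = len(d)
    b = np.zeros(p)
    for _ in range(n_iter):
        b_old = b.copy()
        for j in range(p):
            # Partial residual: gradient of the smooth part with b_j removed.
            r = d[j] - V[j] @ b + V[j, j] * b[j]
            # Exact one-dimensional minimizer in coordinate j.
            b[j] = soft_threshold(r, lam) / V[j, j]
        if np.max(np.abs(b - b_old)) < tol:
            break
    return b
```

With `lam = 0` the iterations solve `V b = d` (the unpenalized pseudo-score equation); the composite penalty of the paper would replace the scalar soft-thresholding step with a combined group-level and within-group shrinkage update.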
Pages: 748-772
Page count: 25