Robust Adaptive Lasso method for parameter's estimation and variable selection in high-dimensional sparse models

Cited by: 16
Authors
Wahid, Abdul [1 ]
Khan, Dost Muhammad [1 ]
Hussain, Ijaz [2 ]
Affiliations
[1] Abdul Wali Khan Univ Mardan, Dept Stat, Khyber Pakhtunkhwa, Pakistan
[2] Quaid I Azam Univ Islamabad, Dept Stat, Islamabad, Pakistan
Source
PLOS ONE | 2017 / Vol. 12 / Issue 08
Keywords
LIKELIHOOD; REGRESSION; SHRINKAGE; FACE;
DOI
10.1371/journal.pone.0183518
Chinese Library Classification
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biosciences]; N [General Natural Sciences]
Subject Classification Codes
07; 0710; 09
Abstract
High-dimensional data are commonly encountered in various scientific fields and pose great challenges to modern statistical analysis. To address this issue, different penalized regression procedures have been introduced in the literature, but these methods cannot cope with outliers and leverage points in heavy-tailed high-dimensional data. For this purpose, a new Robust Adaptive Lasso (RAL) method is proposed, based on a Pearson-residual weighting scheme. The weight function measures the compatibility of each observation with the assumed model and downweights observations that are inconsistent with it. It is observed that the RAL estimator can correctly select the covariates with non-zero coefficients and simultaneously estimate the parameters, not only in the presence of influential observations but also in the presence of high multicollinearity. We also discuss the model selection oracle property and the asymptotic normality of the RAL estimator. Simulation findings and real data examples demonstrate the better performance of the proposed penalized regression approach.
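The abstract describes two ingredients: observation weights that downweight points inconsistent with the assumed model, and an adaptive L1 penalty built from an initial estimate. The sketch below is only an illustration of that general recipe, not the authors' RAL algorithm: it substitutes a Huber initial fit and Huber-type weights on MAD-standardized residuals for the Pearson-residual weighting, and the function name robust_adaptive_lasso together with the tuning constants lam, gamma and c are assumptions made for the example.

    # A minimal sketch (not the authors' implementation): robust adaptive lasso
    # via (1) a robust initial fit, (2) residual-based observation weights,
    # (3) an adaptive L1 penalty solved by column rescaling in scikit-learn.
    import numpy as np
    from sklearn.linear_model import HuberRegressor, Lasso

    def robust_adaptive_lasso(X, y, lam=0.1, gamma=1.0, c=2.5, eps=1e-6):
        # Step 1: robust initial estimate of the coefficients.
        init = HuberRegressor().fit(X, y)
        beta0 = init.coef_

        # Step 2: downweight observations with large standardized residuals
        # (Huber-type weights on MAD-standardized residuals; the paper uses
        # Pearson-residual weights instead).
        resid = y - init.predict(X)
        scale = np.median(np.abs(resid - np.median(resid))) / 0.6745
        u = np.abs(resid) / max(scale, eps)
        w_obs = np.minimum(1.0, c / np.maximum(u, eps))

        # Step 3: adaptive penalty weights from the initial coefficients.
        v = 1.0 / (np.abs(beta0) + eps) ** gamma

        # Weighted adaptive lasso: rescaling columns so that a plain lasso on
        # Z = X / v penalizes v_j * |beta_j|; undo the scaling afterwards.
        Z = X / v
        lasso = Lasso(alpha=lam, max_iter=50_000)
        lasso.fit(Z, y, sample_weight=w_obs)   # sample_weight needs sklearn >= 0.23
        beta = lasso.coef_ / v
        return beta, lasso.intercept_, w_obs

    # Toy usage: sparse signal with a few injected outliers.
    rng = np.random.default_rng(0)
    n, p = 100, 20
    X = rng.normal(size=(n, p))
    beta_true = np.zeros(p)
    beta_true[:3] = [3.0, -2.0, 1.5]
    y = X @ beta_true + rng.normal(size=n)
    y[:5] += 15.0
    beta_hat, intercept, weights = robust_adaptive_lasso(X, y)
    print("selected covariates:", np.flatnonzero(np.abs(beta_hat) > 1e-3))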
Pages: 17
Related Papers
32 records in total
[1] Agostinelli C, Markatou M. A one-step robust estimator for regression based on the weighted likelihood reweighting scheme. Statistics & Probability Letters, 1998, 37(4): 341-350.
[2] Basu A, Sarkar S. The trade-off between robustness and efficiency and the effect of model smoothing in minimum disparity inference. Journal of Statistical Computation and Simulation, 1994, 50(3-4): 173-185.
[3] Basu A, Sarkar S. Minimum disparity estimation in the errors-in-variables model. Statistics & Probability Letters, 1994, 20(1): 69-73.
[4] Belloni A, Chernozhukov V. l1-penalized quantile regression in high-dimensional sparse models. Annals of Statistics, 2011, 39(1): 82-130.
[5] Biswas A, Roy T, Majumder S, Basu A. A new weighted likelihood approach. Stat, 2015, 4(1): 97-107.
[6] Bradic J, Fan J, Wang W. Penalized composite quasi-likelihood for ultrahigh dimensional variable selection. Journal of the Royal Statistical Society Series B - Statistical Methodology, 2011, 73: 325-349.
[7] Bühlmann P. Springer Series in Statistics, 2011: 1. DOI: 10.1007/978-3-642-20192-9.
[8] Candes E. Annals of Statistics, 2007, 35: 2313. DOI: 10.1214/009053606000001523.
[9] Efron B, Hastie T, Johnstone I, Tibshirani R. Least angle regression - Rejoinder. Annals of Statistics, 2004, 32(2): 494-499.
[10] Fan J. arXiv preprint, 2014: 1410.