Support estimation and sign recovery in high-dimensional heteroscedastic mean regression

Cited by: 0
Authors
Hermann, Philipp [1 ]
Holzmann, Hajo [1 ]
Affiliations
[1] Philipps Univ Marburg, Dept Math & Comp Sci, Hans Meerweinstr 6, D-35043 Marburg, Germany
Keywords
convergence rates; Huber loss function; knockoff filter; robust high-dimensional regression; sign recovery; support estimation; variable selection; SPARSITY RECOVERY; LASSO; CONSISTENCY; SYMMETRY;
DOI
10.1111/sjos.12772
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Subject Classification Codes
020208; 070103; 0714
Abstract
A current strand of research in high-dimensional statistics deals with robustifying the methodology with respect to deviations from the pervasive light-tail assumptions. In this article, we consider a linear mean regression model with random design and potentially heteroscedastic, heavy-tailed errors, and investigate support estimation and sign recovery. We use a strictly convex, smooth variant of the Huber loss function with tuning parameters depending on the parameters of the problem, as well as the adaptive LASSO penalty for computational efficiency. For the resulting estimator, we show sign consistency and optimal rates of convergence in the $\ell_\infty$ norm, as in the homoscedastic, light-tailed setting. In our simulations, we also connect to the recent literature on variable selection with the thresholded LASSO and on false discovery rate control using knockoffs, and indicate the relevance of the Donoho-Tanner transition curve for variable selection. The simulations illustrate the favorable numerical performance of the proposed methodology.
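Below is a minimal illustrative sketch (not the authors' implementation) of the two-step procedure the abstract describes: a robust pilot fit with a Huber-type loss and a plain LASSO penalty, followed by a refit with adaptive weights, solved by proximal gradient descent. The standard Huber loss is used as a stand-in for the strictly convex, smooth variant studied in the paper, and all tuning constants (delta, lam, the step size, the simulated design) are assumptions chosen for illustration only.

```python
# Sketch: Huber-type loss + adaptive LASSO penalty via proximal gradient (ISTA).
# Not the authors' code; tuning values are illustrative assumptions.
import numpy as np

def huber_grad(r, delta):
    """Derivative of the Huber loss in the residual: identity near 0, clipped in the tails."""
    return np.where(np.abs(r) <= delta, r, delta * np.sign(r))

def soft_threshold(z, t):
    """Component-wise soft-thresholding, the proximal operator of a weighted L1 norm."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def huber_adaptive_lasso(X, y, lam, delta=1.345, weights=None, n_iter=2000):
    """Minimize (1/n) * sum_i Huber_delta(y_i - x_i' beta) + lam * sum_j w_j |beta_j|."""
    n, p = X.shape
    w = np.ones(p) if weights is None else weights
    # The Huber score has slope <= 1, so ||X||_2^2 / n bounds the gradient's Lipschitz constant.
    step = n / (np.linalg.norm(X, 2) ** 2)
    beta = np.zeros(p)
    for _ in range(n_iter):
        grad = -X.T @ huber_grad(y - X @ beta, delta) / n
        beta = soft_threshold(beta - step * grad, step * lam * w)
    return beta

# Simulated heteroscedastic, heavy-tailed example (hypothetical parameters).
rng = np.random.default_rng(0)
n, p, s = 200, 500, 5
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:s] = 2.0
noise = (1.0 + 0.5 * np.abs(X[:, 0])) * rng.standard_t(df=3, size=n)
y = X @ beta_true + noise

# Two-step adaptive LASSO: pilot fit with uniform weights, then reweighted refit.
# In practice lam would be tuned, e.g. by cross-validation.
pilot = huber_adaptive_lasso(X, y, lam=0.1)
w = 1.0 / np.maximum(np.abs(pilot), 1e-3)
beta_hat = huber_adaptive_lasso(X, y, lam=0.1, weights=w)

support_hat = np.flatnonzero(beta_hat)          # estimated support
signs_hat = np.sign(beta_hat[support_hat])      # recovered signs
```

In this toy setup, support estimation means support_hat coinciding with the first s coordinates and sign recovery means signs_hat matching np.sign(beta_true) there; the thresholded-LASSO and knockoff comparisons from the paper's simulations are not reproduced in this sketch.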
Pages: 35