Support estimation and sign recovery in high-dimensional heteroscedastic mean regression

Cited: 0
Authors
Hermann, Philipp [1 ]
Holzmann, Hajo [1 ]
Affiliations
[1] Philipps Univ Marburg, Dept Math & Comp Sci, Hans Meerweinstr 6, D-35043 Marburg, Germany
Keywords
convergence rates; Huber loss function; knockoff filter; robust high-dimensional regression; sign recovery; support estimation; variable selection; SPARSITY RECOVERY; LASSO; CONSISTENCY; SYMMETRY;
DOI
10.1111/sjos.12772
CLC Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Subject Classification Codes
020208 ; 070103 ; 0714 ;
Abstract
A current strand of research in high-dimensional statistics deals with robustifying the methodology with respect to deviations from the pervasive light-tail assumptions. In this article, we consider a linear mean regression model with random design and potentially heteroscedastic, heavy-tailed errors, and investigate support estimation and sign recovery. We use a strictly convex, smooth variant of the Huber loss function with tuning parameters depending on the parameters of the problem, as well as the adaptive LASSO penalty for computational efficiency. For the resulting estimator, we show sign-consistency and optimal rates of convergence in the $\ell_\infty$ norm as in the homoscedastic, light-tailed setting. In our simulations, we also connect to the recent literature on variable selection with the thresholded LASSO and false discovery rate control using knockoffs and indicate the relevance of the Donoho-Tanner transition curve for variable selection. The simulations illustrate the favorable numerical performance of the proposed methodology.
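The two-stage approach the abstract describes (a smooth, strictly convex Huber-type loss combined with an adaptive LASSO penalty) can be illustrated with a minimal sketch. The code below is not the authors' implementation: it uses the pseudo-Huber loss as a stand-in for the smooth Huber variant, fixed placeholder tuning parameters `lam` and `delta` rather than the problem-dependent choices analyzed in the paper, and a simple proximal-gradient (ISTA) solver.

```python
# Illustrative sketch (not the paper's method): adaptive (weighted) LASSO
# with a smooth, strictly convex Huber-type loss, fitted by proximal
# gradient descent. Tuning parameters here are arbitrary placeholders.
import numpy as np

rng = np.random.default_rng(0)
n, p, s = 200, 50, 5
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:s] = [2.0, -1.5, 1.0, -1.0, 0.5]
# heteroscedastic, heavy-tailed errors: scale depends on the design,
# innovations are t-distributed with 3 degrees of freedom
scale = 0.5 + 0.5 * np.abs(X[:, 0])
y = X @ beta_true + scale * rng.standard_t(df=3, size=n)

def pseudo_huber_grad(r, delta=1.0):
    # gradient of the pseudo-Huber loss delta^2 (sqrt(1 + (r/delta)^2) - 1),
    # a strictly convex, smooth variant of the Huber loss
    return r / np.sqrt(1.0 + (r / delta) ** 2)

def fit_weighted_lasso_huber(X, y, lam, w, delta=1.0, n_iter=2000):
    n, p = X.shape
    # step size from a Lipschitz bound on the smooth part of the objective
    L = np.linalg.norm(X, 2) ** 2 / n
    beta = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ pseudo_huber_grad(X @ beta - y, delta) / n
        z = beta - grad / L
        # soft-thresholding = proximal step for the weighted l1 penalty
        thr = lam * w / L
        beta = np.sign(z) * np.maximum(np.abs(z) - thr, 0.0)
    return beta

# stage 1: unit-weight penalized fit, used only to build adaptive weights
beta_init = fit_weighted_lasso_huber(X, y, lam=0.1, w=np.ones(p))
w_adapt = 1.0 / np.maximum(np.abs(beta_init), 1e-3)
# stage 2: adaptive LASSO refit with the data-driven weights
beta_hat = fit_weighted_lasso_huber(X, y, lam=0.1, w=w_adapt)

print("estimated support:", np.flatnonzero(beta_hat))
```

In this easy, well-separated setting the estimated support and signs typically match the truth; the paper's contribution is to prove sign-consistency and $\ell_\infty$ rates under heteroscedastic, heavy-tailed errors with carefully calibrated tuning parameters.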
Pages: 35