Honest Confidence Sets for High-Dimensional Regression by Projection and Shrinkage

Cited by: 0
Authors
Zhou, Kun [1 ]
Li, Ker-Chau [1 ,2 ]
Zhou, Qing [1 ]
Affiliations
[1] Univ Calif Los Angeles, Dept Stat, Los Angeles, CA 90095 USA
[2] Acad Sinica, Inst Stat Sci, Nangang, Taiwan
Funding
National Science Foundation (USA);
Keywords
Adaptive confidence set; High-dimensional inference; Sparse linear regression; Stein estimate; SIMULTANEOUS INFERENCE; INTERVALS; LASSO; ESTIMATORS; SELECTION; REGIONS; RATES;
DOI
10.1080/01621459.2021.1938581
CLC classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Discipline codes
020208 ; 070103 ; 0714 ;
Abstract
The issue of honesty in constructing confidence sets arises in nonparametric regression. While the optimal rate in nonparametric estimation can be achieved and used to construct sharp confidence sets, a severe degradation of the confidence level often occurs after estimating the degree of smoothness. Similarly, for high-dimensional regression, oracle inequalities for sparse estimators could be used to construct sharp confidence sets, yet the degree of sparsity itself is unknown and needs to be estimated, which causes the honesty problem. To resolve this issue, we develop a novel method to construct honest confidence sets for sparse high-dimensional linear regression. The key idea in our construction is to separate signals into a strong and a weak group, and then to construct confidence sets for each group separately. This is achieved by a projection and shrinkage approach, the latter implemented via Stein estimation and the associated Stein unbiased risk estimate (SURE). Our confidence set is honest over the full parameter space without any sparsity constraints, while its size adapts to the optimal rate of n^(-1/4) when the true parameter is indeed sparse. Moreover, under some form of a separation assumption between the strong and weak signals, the diameter of our confidence set can achieve a faster rate than existing methods. Through extensive numerical comparisons on both simulated and real data, we demonstrate that our method outperforms other competitors by big margins for finite samples, including oracle methods built upon the true sparsity of the underlying model.
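The shrinkage component of the method rests on James-Stein estimation and Stein's unbiased risk estimate. The paper's actual construction (projection onto strong/weak signal groups in a regression model) is more involved, but a minimal sketch of the basic ingredients, under the simplifying assumption of a Gaussian sequence model z ~ N(theta, sigma^2 I_p) with illustrative values, is:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: weak (here zero) signal observed with Gaussian noise
p, sigma = 50, 1.0
theta = np.zeros(p)
z = theta + sigma * rng.standard_normal(p)  # z ~ N(theta, sigma^2 I_p)

# James-Stein shrinkage of z toward the origin
shrink = 1.0 - (p - 2) * sigma**2 / np.sum(z**2)
theta_js = shrink * z

# Stein's unbiased risk estimate (SURE) for the James-Stein estimator:
# an unbiased estimate of E||theta_js - theta||^2, always below the
# risk p * sigma^2 of the unshrunk estimate z when p > 2
sure = p * sigma**2 - (p - 2)**2 * sigma**4 / np.sum(z**2)

mle_loss = np.sum((z - theta) ** 2)          # squared loss of z itself
js_loss = np.sum((theta_js - theta) ** 2)    # squared loss after shrinkage
```

In the paper, a quantity of this SURE type calibrates the confidence set for the weak-signal group, so that its size adapts to the unknown sparsity without having to estimate the sparsity level directly.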
Pages: 469-488
Page count: 20