Optimal post-selection inference for sparse signals: a nonparametric empirical Bayes approach

Cited by: 4
Authors
Woody, S. [1 ]
Padilla, O. H. M. [2 ]
Scott, J. G. [3 ]
Affiliations
[1] Univ Texas Austin, Dept Integrat Biol, 2415 Speedway C0930, Austin, TX 78751 USA
[2] Univ Calif Los Angeles, Dept Stat, Los Angeles, CA 90095 USA
[3] Univ Texas Austin, Dept Informat Risk & Operat Management, 2110 Speedway B6500, Austin, TX 78751 USA
Keywords
Biased test; Coverage; Post-selection inference; Selection bias; Shrinkage; False discovery rate; Confidence intervals
DOI
10.1093/biomet/asab014
Chinese Library Classification (CLC)
Q [Biological Sciences];
Discipline Classification Code
07; 0710; 09;
Abstract
Many recently developed Bayesian methods focus on sparse signal detection. However, much less work has been done on the natural follow-up question: how does one make valid inferences for the magnitude of those signals after selection? Ordinary Bayesian credible intervals suffer from selection bias, as do ordinary frequentist confidence intervals. Existing Bayesian methods for correcting this bias produce credible intervals with poor frequentist properties. Further, existing frequentist approaches require sacrificing the benefits of shrinkage typical in Bayesian methods, resulting in confidence intervals that are needlessly wide. We address this gap by proposing a nonparametric empirical Bayes approach to constructing optimal selection-adjusted confidence sets. Our method produces confidence sets that are as short as possible on average, while both adjusting for selection and maintaining exact frequentist coverage uniformly over the parameter space. We demonstrate an important consistency property of our procedure: under mild conditions, it asymptotically converges to the results of an oracle-Bayes analysis in which the prior distribution of signal sizes is known exactly. Across a series of examples, the method is found to outperform existing frequentist techniques for post-selection inference, producing confidence sets that are notably shorter, but with the same coverage guarantee.
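As a concrete illustration of the selection bias described in the abstract, the following minimal NumPy simulation (our sketch, not the paper's procedure; the sparsity level, signal distribution, and |y| > 1.96 selection rule are illustrative assumptions) shows naive 95% confidence intervals undercovering badly on the selected coordinates:

import numpy as np

# Illustrative simulation of the selection-bias problem; the settings
# below are arbitrary assumptions, not the authors' method.
rng = np.random.default_rng(0)
n = 50_000
sparsity, signal_sd = 0.05, 2.0        # assume 5% of coordinates carry a true signal

# Sparse truth: most theta_i are exactly zero
is_signal = rng.random(n) < sparsity
theta = np.where(is_signal, rng.normal(0.0, signal_sd, size=n), 0.0)
y = theta + rng.standard_normal(n)     # observe y_i ~ N(theta_i, 1)

z = 1.959964                           # two-sided 95% normal quantile
selected = np.abs(y) > z               # select "significant" coordinates

# Naive, unadjusted 95% interval: y_i +/- z
covered = (y - z <= theta) & (theta <= y + z)

print(f"coverage over all coordinates:      {covered.mean():.3f}")            # ~0.95
print(f"coverage over selected coordinates: {covered[selected].mean():.3f}")  # far below 0.95

The mechanism is visible by construction: for a null coordinate (theta_i = 0), selection requires |y_i| > 1.96, which forces the naive interval y_i +/- 1.96 to exclude zero, so every false discovery is miscovered. The selection-adjusted empirical Bayes sets proposed in the paper restore exact coverage conditional on selection while using shrinkage to keep the intervals short.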
Pages: 1-16
Page count: 16
Related Papers
33 records in total (first 10 shown)
  • [1] Post-Selection Inference
    Kuchibhotla, Arun K.
    Kolassa, John E.
    Kuffner, Todd A.
    ANNUAL REVIEW OF STATISTICS AND ITS APPLICATION, 2022, 9 : 505 - 527
  • [2] Valid post-selection inference
    Berk, Richard
    Brown, Lawrence
    Buja, Andreas
    Zhang, Kai
    Zhao, Linda
    ANNALS OF STATISTICS, 2013, 41 (02) : 802 - 837
  • [3] Exact post-selection inference, with application to the lasso
    Lee, Jason D.
    Sun, Dennis L.
    Sun, Yuekai
    Taylor, Jonathan E.
    ANNALS OF STATISTICS, 2016, 44 (03) : 907 - 927
  • [4] On Post-selection Inference in A/B Testing
    Deng, Alex
    Li, Yicheng
    Lu, Jiannan
    Ramamurthy, Vivek
    KDD '21: PROCEEDINGS OF THE 27TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY & DATA MINING, 2021, : 2743 - 2752
  • [5] Splitting strategies for post-selection inference
    Rasines, D. Garcia
    Young, G. A.
    BIOMETRIKA, 2023, 110 (03) : 597 - 614
  • [6] Asymptotic post-selection inference for regularized graphical models
    Guglielmini, Sofia
    Claeskens, Gerda
    STATISTICS AND COMPUTING, 2025, 35 (02)
  • [7] Asymptotic post-selection inference for the Akaike information criterion
    Charkhi, Ali
    Claeskens, Gerda
    BIOMETRIKA, 2018, 105 (03) : 645 - 664
  • [8] Post-selection inference via algorithmic stability
    Zrnic, Tijana
    Jordan, Michael I.
    ANNALS OF STATISTICS, 2023, 51 (04) : 1666 - 1691
  • [9] Post-Selection Inference with HSIC-Lasso
    Freidling, Tobias
    Poignard, Benjamin
    Climente-Gonzalez, Hector
    Yamada, Makoto
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139
  • [10] Exact post-selection inference for adjusted R squared selection
    Pirenne, Sarah
    Claeskens, Gerda
    STATISTICS & PROBABILITY LETTERS, 2024, 211