SPReM: Sparse Projection Regression Model For High-Dimensional Linear Regression

Cited by: 11
Authors
Sun, Qiang [1 ]
Zhu, Hongtu [2 ]
Liu, Yufeng [3 ]
Ibrahim, Joseph G. [2 ]
Affiliations
[1] Univ N Carolina, Dept Biostat, Chapel Hill, NC 27599 USA
[2] Univ N Carolina, Dept Biostat, Chapel Hill, NC 27599 USA
[3] Univ N Carolina, Dept Stat & Operat Res, Chapel Hill, NC 27599 USA
Funding
Canadian Institutes of Health Research; US National Institutes of Health; US National Science Foundation
Keywords
Heritability ratio; Imaging genetics; Multivariate regression; Projection regression; Sparse; Wild bootstrap; PRINCIPAL-COMPONENTS; BRAIN-DEVELOPMENT; MULTIVARIATE; CLASSIFICATION; FMRI; HERITABILITY; CONVERGENCE; RESPONSES; SELECTION; GENETICS;
DOI
10.1080/01621459.2014.892008
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Subject Classification Codes
020208; 070103; 0714
Abstract
The aim of this article is to develop a sparse projection regression modeling (SPReM) framework for multivariate regression with a large number of responses and a multivariate covariate of interest. We propose two novel heritability ratios to simultaneously perform dimension reduction, response selection, estimation, and testing, while explicitly accounting for correlations among the multivariate responses. SPReM is devised specifically to address the low statistical power of many standard approaches for high-dimensional data, such as Hotelling's T^2 test statistic or mass univariate analysis. We formulate the estimation problem of SPReM as a novel sparse unit rank projection (SURP) problem and propose a fast optimization algorithm for SURP. Furthermore, we extend SURP to the sparse multirank projection (SMURP) by adopting a sequential SURP approximation. Theoretically, we systematically investigate the convergence properties of SURP and the convergence rate of its estimates. Simulation results and a real data analysis show that SPReM outperforms other state-of-the-art methods.
Pages: 289-302 (14 pages)
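As an illustration of the projection idea summarized in the abstract, the following is a minimal Python sketch of sparse projection regression: it looks for a sparse weight vector w such that the projected response Yw is strongly associated with the covariates X, using a generic power-iteration-plus-soft-thresholding heuristic. This is not the authors' SURP algorithm or their heritability-ratio criterion; the function names, the penalty parameter lam, and the toy data are hypothetical choices made only for this demonstration.

import numpy as np

# Illustrative sketch only: a generic "sparse projection regression" heuristic,
# not the SURP algorithm or heritability-ratio criterion from the paper.

def soft_threshold(v, lam):
    # Elementwise soft-thresholding; sets small loadings exactly to zero.
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def sparse_projection_direction(Y, X, lam=0.1, n_iter=100):
    # Y: (n, q) multivariate responses; X: (n, p) covariates of interest.
    # Returns a unit-norm, sparse weight vector w of length q so that the
    # projected response Y @ w is strongly associated with X.
    n, q = Y.shape
    H = X @ np.linalg.pinv(X.T @ X) @ X.T                # hat (projection) matrix of X
    B = Y.T @ H @ Y                                      # "explained" cross-product
    W = Y.T @ (np.eye(n) - H) @ Y + 1e-8 * np.eye(q)     # residual cross-product (+ small ridge)
    w = np.linalg.eigh(B)[1][:, -1]                      # dense starting direction
    for _ in range(n_iter):
        w = np.linalg.solve(W, B @ w)                    # push w toward high signal-to-noise
        w = soft_threshold(w, lam * np.max(np.abs(w)))   # then sparsify the loadings
        nrm = np.linalg.norm(w)
        if nrm == 0:
            raise ValueError("lam too large: all loadings shrunk to zero")
        w /= nrm
    return w

# Toy usage: 200 subjects, 50 responses, 3 covariates; only the first
# five responses are simulated to depend on X.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
signal = X @ rng.normal(size=(3, 5))
Y = np.hstack([signal, np.zeros((200, 45))]) + rng.normal(size=(200, 50))
w = sparse_projection_direction(Y, X, lam=0.3)
print("nonzero loadings:", np.flatnonzero(np.abs(w) > 1e-6))

In this toy setting only the first five response columns are simulated to depend on X, so the surviving nonzero loadings should concentrate there; larger values of lam shrink more loadings to zero.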
Related Papers
50 records in total (first 10 shown below)
  • [1] Xie, Junshan; Xiao, Nannan. The likelihood ratio test for high-dimensional linear regression model. Communications in Statistics - Theory and Methods, 2017, 46(17): 8479-8492.
  • [2] Guo, Wenxing; Balakrishnan, Narayanaswamy; Bian, Mengjie. Reduced rank regression with matrix projections for high-dimensional multivariate linear regression model. Electronic Journal of Statistics, 2021, 15(2): 4167-4191.
  • [3] Li, Mengyao; Zhang, Jiangshe; Zhang, Jun. Projection tests for regression coefficients in high-dimensional partial linear models. Communications in Statistics - Theory and Methods, 2024.
  • [4] Kurnaz, Fatma Sevinc; Hoffmann, Irene; Filzmoser, Peter. Robust and sparse estimation methods for high-dimensional linear and logistic regression. Chemometrics and Intelligent Laboratory Systems, 2018, 172: 211-222.
  • [5] Tsuda, Toshiki; Imaizumi, Masaaki. Benign overfitting of non-sparse high-dimensional linear regression with correlated noise. Electronic Journal of Statistics, 2024, 18(2): 4119-4197.
  • [6] Mclain, Alexander C.; Zgodic, Anja; Bondell, Howard. Efficient sparse high-dimensional linear regression with a partitioned empirical Bayes ECM algorithm. Computational Statistics & Data Analysis, 2025, 207.
  • [7] Valente, Giancarlo; Castellanos, Agustin Lage; Vanacore, Gianluca; Formisano, Elia. Multivariate linear regression of high-dimensional fMRI data with multiple target variables. Human Brain Mapping, 2014, 35(5): 2163-2177.
  • [8] Zhou, Kun; Li, Ker-Chau; Zhou, Qing. Honest Confidence Sets for High-Dimensional Regression by Projection and Shrinkage. Journal of the American Statistical Association, 2023, 118(541): 469-488.
  • [9] Cui, Xiaolong; Geng, Haoyu; Wang, Zhaojun; Zou, Changliang. Robust Estimation of High-Dimensional Linear Regression With Changepoints. IEEE Transactions on Information Theory, 2024, 70(10): 7297-7319.
  • [10] Filzmoser, Peter; Nordhausen, Klaus. Robust linear regression for high-dimensional data: An overview. Wiley Interdisciplinary Reviews - Computational Statistics, 2021, 13(4).