A convex formulation for high-dimensional sparse sliced inverse regression

Cited by: 19
Authors
Tan, Kean Ming [1 ]
Wang, Zhaoran [2 ]
Zhang, Tong [3 ]
Liu, Han [3 ]
Cook, R. Dennis [1 ]
Affiliations
[1] Univ Minnesota, Sch Stat, 313 Ford Hall,224 Church St SE, Minneapolis, MN 55455 USA
[2] Northwestern Univ, Ind Engn & Management Sci, 2145 Sheridan Rd, Evanston, IL 60208 USA
[3] Tencent Technol, Tencent AI Lab, Netac Bldg,High Tech 6th South Rd, Shenzhen, Peoples R China
Funding
National Science Foundation (USA)
Keywords
Convex optimization; Dimension reduction; Nonparametric regression; Principal fitted component; Alternating direction method; Structural dimension; Reduction; Asymptotics; Multipliers
DOI
10.1093/biomet/asy049
Chinese Library Classification
Q [Biological Sciences]
Discipline classification codes
07; 0710; 09
Abstract
Sliced inverse regression is a popular tool for sufficient dimension reduction, which replaces the covariates with a minimal set of their linear combinations without loss of information on the conditional distribution of the response given the covariates. The estimated linear combinations include all covariates, making the results difficult to interpret and perhaps unnecessarily variable, particularly when the number of covariates is large. In this paper, we propose a convex formulation for fitting sparse sliced inverse regression in high dimensions. Our proposal estimates the subspace of the linear combinations of the covariates directly and performs variable selection simultaneously. We solve the resulting convex optimization problem via the linearized alternating direction method of multipliers algorithm, and establish an upper bound on the subspace distance between the estimated and the true subspaces. Through numerical studies, we show that our proposal is able to identify the correct covariates in the high-dimensional setting.
Pages: 769-782 (14 pages)
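For context, classical (non-sparse) sliced inverse regression, the estimator that the abstract's convex formulation sparsifies, can be sketched as follows. This is a minimal illustration of the standard procedure (standardize, slice the response, eigen-decompose the between-slice covariance), not the authors' implementation; the function name and the equal-size slicing scheme are assumptions made for the example.

```python
import numpy as np

def sir_directions(X, y, n_slices=10, n_dirs=1):
    """Classical sliced inverse regression: estimate directions
    spanning the central subspace via the between-slice covariance
    of standardized covariates. Illustrative sketch only."""
    n, p = X.shape
    # Standardize covariates: Z = (X - mean) @ Sigma^{-1/2}.
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / n
    evals, evecs = np.linalg.eigh(cov)
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ inv_sqrt
    # Partition observations into slices by the sorted response.
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    # Weighted between-slice covariance of the slice means of Z.
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of M, mapped back to the original scale.
    w, v = np.linalg.eigh(M)
    B = inv_sqrt @ v[:, ::-1][:, :n_dirs]
    return B / np.linalg.norm(B, axis=0)
```

In high dimensions this estimator uses all covariates, which is exactly the interpretability problem the paper addresses by estimating the subspace with a sparsity-inducing convex program instead of a plain eigen-decomposition.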