High-dimensional sparse single–index regression via Hilbert–Schmidt independence criterion

Cited by: 0
Authors
Xin Chen
Chang Deng
Shuaida He
Runxiong Wu
Jia Zhang
Affiliations
[1] Southern University of Science and Technology,Department of Statistics and Data Science
[2] University of Chicago,Booth School of Business
[3] University of California,College of Engineering
[4] Southwestern University of Finance and Economics,Joint Laboratory of Data Science and Business Intelligence
Source
Statistics and Computing | 2024, Vol. 34
Keywords
Hilbert-Schmidt independence criterion; Single-index models; Large p, small n; Majorization-minimization; Sufficient dimension reduction; Variable selection
DOI
Not available
Abstract
The Hilbert-Schmidt independence criterion (HSIC) has recently been introduced to the field of single-index models to estimate the directions. Compared with other well-established methods, the HSIC-based method requires relatively weak conditions. However, its performance has not yet been studied in the prevalent high-dimensional scenarios, where the number of covariates can be much larger than the sample size. In this article, based on HSIC, we propose to estimate the possibly sparse directions in high-dimensional single-index models through a parameter reformulation. Our approach estimates the subspace spanned by the direction directly and performs variable selection simultaneously. Owing to the non-convexity of the objective function and the complexity of the constraints, a majorization-minimization algorithm combined with the linearized alternating direction method of multipliers is developed to solve the optimization problem. Since the algorithm does not involve the inverse of the covariance matrix, it can naturally handle large-p-small-n scenarios. Through extensive simulation studies and a real data analysis, we show that our proposal is efficient and effective in high-dimensional settings. The Matlab codes for this method are available online.
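To make the dependence measure underlying the abstract concrete, the following is a minimal sketch of the standard biased empirical HSIC estimate, tr(KHLH)/n², with Gaussian kernels. This illustrates only the generic HSIC statistic, not the authors' sparse single-index estimator or their MM/linearized-ADMM algorithm; the function names and the bandwidth choice are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(x, sigma=1.0):
    """Gaussian (RBF) kernel matrix for an (n, d) sample."""
    d2 = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic(x, y, sigma=1.0):
    """Biased empirical HSIC: trace(K H L H) / n^2, H = I - (1/n) 11'."""
    n = x.shape[0]
    K = rbf_kernel(x, sigma)
    L = rbf_kernel(y, sigma)
    H = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    return np.trace(K @ H @ L @ H) / n ** 2

# A strongly dependent pair should score higher than an independent pair.
rng = np.random.default_rng(0)
x = rng.normal(size=(100, 1))
y_dep = np.sin(x) + 0.1 * rng.normal(size=(100, 1))
y_ind = rng.normal(size=(100, 1))
print(hsic(x, y_dep) > hsic(x, y_ind))
```

Because H is idempotent, tr(KHLH) = tr((HKH)(HLH)) is a trace of a product of two positive semi-definite matrices, so this biased estimate is always nonnegative; larger values indicate stronger dependence between x and y.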