High-dimensional sparse single-index regression via Hilbert-Schmidt independence criterion

Cited by: 1
Authors
Chen, Xin [1 ]
Deng, Chang [2 ]
He, Shuaida [1 ]
Wu, Runxiong [3 ]
Zhang, Jia [4 ]
Affiliations
[1] Southern Univ Sci & Technol, Dept Stat & Data Sci, Shenzhen, Peoples R China
[2] Univ Chicago, Booth Sch Business, Chicago, IL USA
[3] Univ Calif Davis, Coll Engn, Davis, CA USA
[4] Southwestern Univ Finance & Econ, Joint Lab Data Sci & Business Intelligence, Chengdu, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Hilbert-Schmidt independence criterion; Single-index models; Large p small n; Majorization-minimization; Sufficient dimension reduction; Variable selection; SLICED INVERSE REGRESSION; ALTERNATING DIRECTION METHOD; SUFFICIENT DIMENSION; ADAPTIVE ESTIMATION; CENTRAL SUBSPACE; REDUCTION; MULTIPLIERS; RATES;
DOI
10.1007/s11222-024-10399-4
Chinese Library Classification (CLC)
TP301 [Theory and Methods];
Discipline classification code
081202;
Abstract
The Hilbert-Schmidt Independence Criterion (HSIC) has recently been introduced to the field of single-index models to estimate the directions. Compared with other well-established methods, the HSIC-based method requires relatively weak conditions. However, its performance has not yet been studied in the prevalent high-dimensional scenarios, where the number of covariates can be much larger than the sample size. In this article, based on HSIC, we propose to estimate the possibly sparse directions in high-dimensional single-index models through a parameter reformulation. Our approach estimates the subspace spanned by the direction directly and performs variable selection simultaneously. Due to the non-convexity of the objective function and the complexity of the constraints, a majorize-minimize algorithm together with the linearized alternating direction method of multipliers is developed to solve the optimization problem. Since it does not involve the inverse of the covariance matrix, the algorithm can naturally handle large p, small n scenarios. Through extensive simulation studies and a real data analysis, we show that our proposal is efficient and effective in high-dimensional settings. The Matlab codes for this method are available online.
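For context, the criterion named in the abstract has a simple empirical form: with Gram matrices K and L built from kernels on the two samples and the centring matrix H = I - (1/n)11', the biased estimate is HSIC_n = trace(K H L H) / n^2. The Python sketch below computes this statistic with Gaussian kernels; it only illustrates the criterion itself (the bandwidths, function names, and toy data are assumptions) and does not reproduce the authors' Matlab implementation or their sparse single-index objective.

# Minimal sketch of the biased empirical HSIC statistic with Gaussian kernels.
import numpy as np

def gaussian_gram(z, bandwidth):
    """Gram matrix K with K_ij = exp(-||z_i - z_j||^2 / (2 * bandwidth^2))."""
    z = np.atleast_2d(z)
    if z.shape[0] == 1:          # accept 1-D inputs as a single column
        z = z.T
    sq_dists = np.sum((z[:, None, :] - z[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq_dists / (2.0 * bandwidth ** 2))

def empirical_hsic(u, v, bu=1.0, bv=1.0):
    """Biased V-statistic estimate: trace(K H L H) / n^2, H = I - (1/n) 1 1^T."""
    n = len(u)
    K = gaussian_gram(u, bu)
    L = gaussian_gram(v, bv)
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / n ** 2

# Toy check: a single-index signal yields a larger HSIC value than pure noise.
rng = np.random.default_rng(0)
x = rng.normal(size=(200, 5))
beta = np.array([1.0, -1.0, 0.0, 0.0, 0.0]) / np.sqrt(2)
y = np.sin(x @ beta) + 0.1 * rng.normal(size=200)
print(empirical_hsic(x @ beta, y), empirical_hsic(rng.normal(size=200), y))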
Pages: 13