High-dimensional sparse single-index regression via Hilbert-Schmidt independence criterion

Cited: 1
Authors
Chen, Xin [1 ]
Deng, Chang [2 ]
He, Shuaida [1 ]
Wu, Runxiong [3 ]
Zhang, Jia [4 ]
Affiliations
[1] Southern Univ Sci & Technol, Dept Stat & Data Sci, Shenzhen, Peoples R China
[2] Univ Chicago, Booth Sch Business, Chicago, IL USA
[3] Univ Calif Davis, Coll Engn, Davis, CA USA
[4] Southwestern Univ Finance & Econ, Joint Lab Data Sci & Business Intelligence, Chengdu, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Hilbert-Schmidt independence criterion; Single-index models; Large p small n; Majorization-minimization; Sufficient dimension reduction; Variable selection; SLICED INVERSE REGRESSION; ALTERNATING DIRECTION METHOD; SUFFICIENT DIMENSION; ADAPTIVE ESTIMATION; CENTRAL SUBSPACE; REDUCTION; MULTIPLIERS; RATES;
DOI
10.1007/s11222-024-10399-4
Chinese Library Classification
TP301 [Theory, Methods]
Discipline code
081202
Abstract
The Hilbert-Schmidt Independence Criterion (HSIC) has recently been introduced to the field of single-index models to estimate the directions. Compared with other well-established methods, the HSIC-based method requires relatively weak conditions. However, its performance has not yet been studied in the prevalent high-dimensional scenarios, where the number of covariates can be much larger than the sample size. In this article, based on HSIC, we propose to estimate the possibly sparse directions in high-dimensional single-index models through a parameter reformulation. Our approach estimates the subspace of the direction directly and performs variable selection simultaneously. Owing to the non-convexity of the objective function and the complexity of the constraints, a majorize-minimize algorithm combined with the linearized alternating direction method of multipliers is developed to solve the optimization problem. Since it does not involve the inverse of the covariance matrix, the algorithm can naturally handle large p small n scenarios. Through extensive simulation studies and a real data analysis, we show that our proposal is efficient and effective in high-dimensional settings. The Matlab codes for this method are available online.
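As a minimal illustration of the dependence measure underlying the method (not the authors' Matlab implementation or their sparse estimator), the standard biased empirical HSIC with Gaussian kernels, HSIC = (1/n²) tr(K H L H), can be sketched in Python as follows; the kernel bandwidth `sigma` and the toy data are assumptions for demonstration only:

```python
import numpy as np

def rbf_kernel(x, sigma=1.0):
    # Gaussian (RBF) kernel matrix for a 1-D sample
    d2 = (x[:, None] - x[None, :]) ** 2
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic(x, y, sigma=1.0):
    """Biased empirical HSIC: (1/n^2) * trace(K H L H)."""
    n = len(x)
    K = rbf_kernel(x, sigma)
    L = rbf_kernel(y, sigma)
    H = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    return np.trace(K @ H @ L @ H) / n ** 2

rng = np.random.default_rng(0)
x = rng.standard_normal(200)
# y depends on x through a nonlinear (single-index-like) link
hsic_dep = hsic(x, np.sin(x) + 0.1 * rng.standard_normal(200))
# y independent of x
hsic_ind = hsic(x, rng.standard_normal(200))
```

Larger HSIC values indicate stronger dependence, which is why maximizing an empirical HSIC between the projected covariates and the response recovers the index direction without specifying the link function.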
Pages: 13