High-dimensional sparse single-index regression via Hilbert-Schmidt independence criterion

Times cited: 1
Authors
Chen, Xin [1 ]
Deng, Chang [2 ]
He, Shuaida [1 ]
Wu, Runxiong [3 ]
Zhang, Jia [4 ]
Affiliations
[1] Southern Univ Sci & Technol, Dept Stat & Data Sci, Shenzhen, Peoples R China
[2] Univ Chicago, Booth Sch Business, Chicago, IL USA
[3] Univ Calif Davis, Coll Engn, Davis, CA USA
[4] Southwestern Univ Finance & Econ, Joint Lab Data Sci & Business Intelligence, Chengdu, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Hilbert-Schmidt independence criterion; Single-index models; Large p small n; Majorization-minimization; Sufficient dimension reduction; Variable selection; SLICED INVERSE REGRESSION; ALTERNATING DIRECTION METHOD; SUFFICIENT DIMENSION; ADAPTIVE ESTIMATION; CENTRAL SUBSPACE; REDUCTION; MULTIPLIERS; RATES;
DOI
10.1007/s11222-024-10399-4
CLC number
TP301 [Theory and Methods];
Discipline classification code
081202;
Abstract
The Hilbert-Schmidt Independence Criterion (HSIC) has recently been introduced to single-index models as a means of estimating the index directions. Compared with other well-established methods, the HSIC-based approach requires relatively weak conditions. However, its performance has not yet been studied in the now prevalent high-dimensional setting, where the number of covariates can be much larger than the sample size. In this article, we propose an HSIC-based estimator of the possibly sparse directions in high-dimensional single-index models through a parameter reformulation. Our approach estimates the subspace spanned by the direction directly and performs variable selection simultaneously. Because the objective function is non-convex and the constraints are complex, a majorization-minimization algorithm combined with the linearized alternating direction method of multipliers is developed to solve the optimization problem. Since the algorithm does not involve inverting the covariance matrix, it naturally handles large p, small n scenarios. Extensive simulation studies and a real data analysis show that our proposal is efficient and effective in high-dimensional settings. The Matlab codes for this method are available online.
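To make the criterion concrete, the sketch below gives a standard biased empirical HSIC estimate with Gaussian kernels, written in Matlab since the paper's released code is in Matlab. It illustrates the independence criterion itself, not the authors' sparse single-index estimator; the function names hsic_empirical and gauss_gram and the fixed bandwidths are illustrative assumptions (a median-distance heuristic is a common alternative).

```matlab
function val = hsic_empirical(X, Y, sigma_x, sigma_y)
% Biased empirical HSIC between samples X (n x p) and Y (n x q)
% using Gaussian kernels; a sketch, not the authors' released code.
    n   = size(X, 1);
    Kx  = gauss_gram(X, sigma_x);                % n x n Gram matrix for X
    Ky  = gauss_gram(Y, sigma_y);                % n x n Gram matrix for Y
    H   = eye(n) - ones(n) / n;                  % centering matrix
    val = trace(Kx * H * Ky * H) / (n - 1)^2;    % biased HSIC estimate
end

function K = gauss_gram(Z, sigma)
% Gaussian (RBF) Gram matrix built from squared Euclidean distances.
    sq = sum(Z.^2, 2);
    D2 = max(sq + sq' - 2 * (Z * Z'), 0);        % clip tiny negative values
    K  = exp(-D2 / (2 * sigma^2));
end
```

In the single-index setting, one would evaluate such a criterion between the response Y and the projected covariates X*beta and maximize it over sparse directions; handling the resulting non-convex, constrained problem is the role of the majorization-minimization and linearized ADMM steps described in the abstract.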
Pages: 13
Related papers
50 items in total
  • [41] Sparse Hilbert Schmidt Independence Criterion and Surrogate-Kernel-Based Feature Selection for Hyperspectral Image Classification
    Damodaran, Bharath Bhushan
    Courty, Nicolas
    Lefevre, Sebastien
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2017, 55 (04) : 2385 - 2398
  • [42] ADAPTIVE LASSO FOR SPARSE HIGH-DIMENSIONAL REGRESSION MODELS
    Huang, Jian
    Ma, Shuangge
    Zhang, Cun-Hui
    STATISTICA SINICA, 2008, 18 (04) : 1603 - 1618
  • [43] THE SPARSE LAPLACIAN SHRINKAGE ESTIMATOR FOR HIGH-DIMENSIONAL REGRESSION
    Huang, Jian
    Ma, Shuangge
    Li, Hongzhe
    Zhang, Cun-Hui
    ANNALS OF STATISTICS, 2011, 39 (04) : 2021 - 2046
  • [44] ADMM for High-Dimensional Sparse Penalized Quantile Regression
    Gu, Yuwen
    Fan, Jun
    Kong, Lingchen
    Ma, Shiqian
    Zou, Hui
    TECHNOMETRICS, 2018, 60 (03) : 319 - 331
  • [45] NEARLY OPTIMAL MINIMAX ESTIMATOR FOR HIGH-DIMENSIONAL SPARSE LINEAR REGRESSION
    Zhang, Li
    ANNALS OF STATISTICS, 2013, 41 (04) : 2149 - 2175
  • [46] High-Dimensional Covariance Decomposition into Sparse Markov and Independence Models
    Janzamin, Majid
    Anandkumar, Animashree
    JOURNAL OF MACHINE LEARNING RESEARCH, 2014, 15 : 1549 - 1591
  • [47] Rates of convergence of the constrained least squares estimator in high-dimensional monotone single-index models
    Fragneau, Christopher
    Balabdaoui, Fadoua
    Durot, Cecile
    Stefan, Skander
    COMMUNICATIONS IN STATISTICS-THEORY AND METHODS, 2025, 54 (04) : 1180 - 1204
  • [48] Online sparse sliced inverse regression for high-dimensional streaming data
    Xu, Jianjun
    Cui, Wenquan
    Cheng, Haoyang
    INTERNATIONAL JOURNAL OF WAVELETS MULTIRESOLUTION AND INFORMATION PROCESSING, 2023, 21 (02)
  • [49] Sparse Portfolios for High-Dimensional Financial Index Tracking
    Benidis, Konstantinos
    Feng, Yiyong
    Palomar, Daniel P.
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2018, 66 (01) : 155 - 170
  • [50] High-dimensional index volatility models via Stein's identity
    Na, Sen
    Kolar, Mladen
    BERNOULLI, 2021, 27 (02) : 794 - 817