Sparse Convoluted Rank Regression in High Dimensions

Cited: 8
Authors
Zhou, Le [1 ]
Wang, Boxiang [2 ]
Zou, Hui [3 ]
Affiliations
[1] Hong Kong Baptist Univ, Dept Math, Kowloon Tong, Hong Kong, Peoples R China
[2] Univ Iowa, Dept Stat & Actuarial Sci, Iowa City, IA USA
[3] Univ Minnesota, Sch Stat, Minneapolis, MN 55455 USA
Keywords
Convolution; Efficiency; High dimensions; Information criterion; Rank regression; Nonconcave penalized likelihood; Quantile regression; Variable selection
DOI
10.1080/01621459.2023.2202433
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline Classification Codes
020208; 070103; 0714
Abstract
Wang et al. studied high-dimensional sparse penalized rank regression and established its nice theoretical properties. Compared with least squares, rank regression can yield a substantial gain in estimation efficiency while maintaining a minimal relative efficiency of 86.4%. However, computing penalized rank regression can be very challenging for high-dimensional data because the rank regression loss is highly nonsmooth. In this work we view the rank regression loss as a nonsmooth empirical counterpart of a population-level quantity, and we derive a smooth empirical counterpart by substituting a kernel density estimator for the true distribution in the expectation calculation. This view leads to the convoluted rank regression loss and, consequently, to sparse penalized convoluted rank regression (CRR) for high-dimensional data. We prove several interesting asymptotic properties of CRR. Under the same key assumptions used for sparse rank regression, we establish the rate of convergence of the ℓ1-penalized CRR with a tuning-free penalization parameter and prove the strong oracle property of the folded concave penalized CRR. We further propose a high-dimensional Bayesian information criterion for selecting the penalization parameter in folded concave penalized CRR and prove its selection consistency. We derive an efficient algorithm for solving sparse convoluted rank regression that scales well with high dimensions. Numerical examples demonstrate the promising performance of sparse convoluted rank regression over sparse rank regression. Our theoretical and numerical results suggest that sparse convoluted rank regression enjoys the best of both sparse least squares regression and sparse rank regression. Supplementary materials for this article are available online.
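The abstract describes smoothing the pairwise rank regression loss by convolving the absolute-value function with a kernel density estimator. The sketch below is a minimal illustration of that idea, not the authors' implementation or algorithm: it assumes a Gaussian smoothing kernel, for which the convolved absolute loss has the closed form l_h(u) = u*(2*Phi(u/h) - 1) + 2*h*phi(u/h) with derivative 2*Phi(u/h) - 1, and it solves the ℓ1-penalized problem with a plain proximal-gradient loop. The function names, bandwidth h, fixed penalty lam, and iteration count are hypothetical choices made only for the sketch.

```python
# Minimal sketch (assumed, not the paper's solver) of l1-penalized convoluted
# rank regression with a Gaussian smoothing kernel, fitted by proximal gradient.
import numpy as np
from scipy.stats import norm


def crr_loss_grad(beta, X, y, h):
    """Gaussian-smoothed (convoluted) pairwise rank loss and its gradient.

    The nonsmooth rank loss averages |e_i - e_j| over pairs of residuals;
    convolving |.| with a N(0, h^2) density gives
        l_h(u) = u*(2*Phi(u/h) - 1) + 2*h*phi(u/h),   l_h'(u) = 2*Phi(u/h) - 1.
    Diagonal pairs (i = j) only add a constant and a zero gradient, so for
    simplicity the average below runs over all n^2 pairs.
    """
    e = y - X @ beta
    d = e[:, None] - e[None, :]              # pairwise residual differences
    w = 2 * norm.cdf(d / h) - 1              # derivative of smoothed |.| per pair
    loss = np.mean(d * w + 2 * h * norm.pdf(d / h))
    # gradient wrt beta: -(1/n^2) * sum_{i,j} w_ij * (x_i - x_j)
    grad = -(w.sum(axis=1) - w.sum(axis=0)) @ X / d.size
    return loss, grad


def soft_threshold(z, t):
    """Proximal operator of the l1 penalty."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)


def fit_crr(X, y, lam, h=0.5, n_iter=500):
    """ISTA-style proximal-gradient loop for the l1-penalized smoothed rank loss."""
    n, p = X.shape
    # crude step size from a Lipschitz bound: l_h'' <= 2*phi(0)/h and the
    # pairwise design term is dominated by 2 * ||X||_2^2 / n
    lip = (2 * norm.pdf(0) / h) * 2 * np.linalg.norm(X, 2) ** 2 / n
    step = 1.0 / lip
    beta = np.zeros(p)
    for _ in range(n_iter):
        _, grad = crr_loss_grad(beta, X, y, h)
        beta = soft_threshold(beta - step * grad, step * lam)
    return beta


# Toy usage: approximately recover a sparse signal under heavy-tailed noise.
rng = np.random.default_rng(0)
n, p = 100, 200
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]
y = X @ beta_true + rng.standard_t(df=2, size=n)
beta_hat = fit_crr(X, y, lam=0.1)
```

In the paper itself, the fixed lam, h, and plain ISTA loop used here would be replaced by the proposed efficient algorithm, the tuning-free penalization parameter, the folded concave penalty with its strong oracle property, and the high-dimensional BIC for tuning; the sketch only shows how convolution turns the nonsmooth pairwise loss into a differentiable one.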
Pages: 1500-1512
Page count: 13