Composite smoothed quantile regression

Times Cited: 2
Authors
Yan, Yibo [1 ]
Wang, Xiaozhou [1 ,2 ]
Zhang, Riquan [3 ]
Affiliations
[1] East China Normal Univ, Sch Stat, 3663 North Zhongshan Rd, Shanghai 200062, Peoples R China
[2] East China Normal Univ, Key Lab Adv Theory & Applicat Stat & Data Sci MOE, Shanghai, Peoples R China
[3] Shanghai Univ Int Business & Econ, Sch Stat & Informat, 1900 Wenxiang Rd, Shanghai 201620, Peoples R China
Source
STAT | 2023, Vol. 12, No. 1
Funding
National Natural Science Foundation of China;
Keywords
asymptotic relative efficiency; Bahadur representation; composite quantile regression; convolution-type smoothing; gradient descent; non-asymptotic statistics; variable selection; efficient;
DOI
10.1002/sta4.542
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Discipline Codes
020208; 070103; 0714;
Abstract
Composite quantile regression (CQR) is an efficient method for estimating the parameters of a linear model with non-Gaussian random noise. The non-smoothness of the CQR loss, however, prevents many efficient algorithms from being used. In this paper, we propose the composite smoothed quantile regression (CSQR) model and investigate the inference problem for large-scale datasets, in which the dimensionality p is allowed to increase with the sample size n while p/n ∈ (0, 1). After applying the convolution smoothing technique to the composite quantile loss, we obtain a convex and twice-differentiable CSQR loss function, which can be optimized via the gradient descent algorithm. Theoretically, we establish a non-asymptotic error bound for the CSQR estimators and further provide the Bahadur representation and a Berry-Esseen bound, from which the asymptotic normality of the CSQR estimator follows immediately. To enable valid inference, we construct confidence intervals based on the asymptotic distribution. We also explore the asymptotic relative efficiency of the CSQR estimator with respect to the standard CQR estimator. Finally, we present extensive numerical experiments on both simulated and real data to demonstrate the good performance of our CSQR estimator compared with several baselines.
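To illustrate the idea described in the abstract, the sketch below implements convolution smoothing of the composite quantile loss with a Gaussian kernel and fits it by plain gradient descent. This is a minimal illustration of the general technique, not the paper's actual algorithm: the function name `csqr_fit`, the fixed bandwidth `h`, the quantile levels, and the step size are all illustrative choices, and the paper's tuning and theory may differ. For a Gaussian kernel, the smoothed check loss at level tau has derivative Phi(u/h) - (1 - tau), which is what the gradient uses.

```python
# Minimal sketch of convolution-smoothed composite quantile regression (CSQR)
# with a Gaussian kernel, fitted by gradient descent. Hypothetical names and
# tuning; not the paper's exact algorithm.
import numpy as np
from scipy.stats import norm

def csqr_fit(X, y, taus=(0.25, 0.5, 0.75), h=0.5, lr=0.1, n_iter=2000):
    """Estimate a shared slope beta plus one intercept b_k per quantile level."""
    n, p = X.shape
    K = len(taus)
    taus = np.asarray(taus)
    beta = np.zeros(p)
    b = np.zeros(K)                                # per-level intercepts
    for _ in range(n_iter):
        # residuals for every (observation, quantile level) pair: shape (n, K)
        r = y[:, None] - (X @ beta)[:, None] - b[None, :]
        # derivative of the Gaussian-smoothed check loss: Phi(r/h) - (1 - tau)
        psi = norm.cdf(r / h) - (1.0 - taus)[None, :]
        # gradients of the averaged composite loss (1/(nK)) * sum of losses
        grad_beta = -(X.T @ psi.sum(axis=1)) / (n * K)
        grad_b = -psi.mean(axis=0) / K
        beta -= lr * grad_beta
        b -= lr * grad_b
    return beta, b

# Linear model with heavy-tailed (Student-t, df=3) noise, where composite
# quantile methods are more efficient than least squares.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + rng.standard_t(df=3, size=500)
beta_hat, b_hat = csqr_fit(X, y)
```

Because the smoothed loss is convex and twice differentiable, this plain gradient descent converges; the intercepts `b_hat` approximate the noise quantiles at the chosen levels, while `beta_hat` estimates the shared slope.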
Pages: 14