DISTRIBUTED SPARSE COMPOSITE QUANTILE REGRESSION IN ULTRAHIGH DIMENSIONS

Cited by: 3
Authors
Chen, Canyi [1 ]
Gu, Yuwen [2 ]
Zou, Hui [3 ]
Zhu, Liping [4 ,5 ]
Affiliations
[1] Renmin Univ China, Inst Stat & Big Data, Beijing 100872, Peoples R China
[2] Univ Connecticut, Dept Stat, Storrs, CT 06269 USA
[3] Univ Minnesota, Sch Stat, Minneapolis, MN 55455 USA
[4] Renmin Univ China, Inst Stat & Big Data, Beijing 100872, Peoples R China
[5] Renmin Univ China, Ctr Appl Stat, Beijing 100872, Peoples R China
Funding
National Natural Science Foundation of China; US National Science Foundation;
Keywords
Composite quantile regression; distributed estimation; efficiency; heavy-tailed noise; support recovery; VARIABLE SELECTION; FRAMEWORK; EFFICIENT;
DOI
10.5705/ss.202022.0095
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Discipline Classification Codes
020208 ; 070103 ; 0714 ;
Abstract
We examine distributed estimation and support recovery for ultrahigh dimensional linear regression models under a potentially arbitrary noise distribution. The composite quantile regression is an efficient alternative to the least squares method, and provides robustness against heavy-tailed noise while maintaining reasonable efficiency in the case of light-tailed noise. The highly nonsmooth nature of the composite quantile regression loss poses challenges to both the theoretical and the computational development in an ultrahigh-dimensional distributed estimation setting. Thus, we cast the composite quantile regression into the least squares framework, and propose a distributed algorithm based on an approximate Newton method. This algorithm is efficient in terms of both computation and communication, and requires only gradient information to be communicated between the machines. We show that the resultant distributed estimator attains a near-oracle rate after a constant number of communications, and provide theoretical guarantees for its estimation and support recovery accuracy. Extensive experiments demonstrate the competitive empirical performance of our algorithm.
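The composite quantile regression loss described in the abstract can be illustrated with a minimal sketch (this is our own illustrative code with hypothetical names and toy data, not the paper's algorithm): a single slope vector is shared across several quantile levels, each with its own intercept, and the check loss is summed over levels.

```python
import numpy as np

def check_loss(u, tau):
    # Quantile check loss: rho_tau(u) = u * (tau - 1{u < 0}).
    return u * (tau - (u < 0))

def cqr_loss(beta, intercepts, X, y, taus):
    # Composite quantile regression loss: average the check loss over
    # K quantile levels, sharing one slope vector beta but allowing a
    # separate intercept b_k for each level tau_k.
    total = 0.0
    for b_k, tau in zip(intercepts, taus):
        u = y - b_k - X @ beta
        total += check_loss(u, tau).sum()
    return total / (len(taus) * len(y))

# Toy example: K = 5 equally spaced levels tau_k = k / (K + 1),
# with heavy-tailed (Student-t) noise where CQR is designed to shine.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 3))
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + rng.standard_t(df=2, size=100)
taus = [k / 6 for k in range(1, 6)]
loss = cqr_loss(beta_true, np.zeros(5), X, y, taus)
```

The nonsmoothness the abstract refers to comes from the kink of `check_loss` at zero; the paper's contribution is to sidestep it by recasting CQR in a least squares framework for distributed optimization, which this sketch does not attempt.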
Pages: 1143-1167
Number of pages: 25
Related Papers
50 items in total
  • [31] Optimal portfolio selection using quantile and composite quantile regression models
    Aghamohammadi, A.
    Dadashi, H.
    Sojoudi, Mahdi
    Sojoudi, Meysam
    Tavoosi, M.
    COMMUNICATIONS IN STATISTICS-SIMULATION AND COMPUTATION, 2024, 53 (07) : 3047 - 3057
  • [32] Composite support vector quantile regression estimation
    Shim, Jooyong
    Hwang, Changha
    Seok, Kyungha
    COMPUTATIONAL STATISTICS, 2014, 29 (06) : 1651 - 1665
  • [33] Jackknife Model Averaging for Composite Quantile Regression
    You, Kang
    Wang, Miaomiao
    Zou, Guohua
    JOURNAL OF SYSTEMS SCIENCE & COMPLEXITY, 2024, 37 (04) : 1604 - 1637
  • [34] Bayesian composite quantile regression
    Huang, Hanwen
    Chen, Zhongxue
    JOURNAL OF STATISTICAL COMPUTATION AND SIMULATION, 2015, 85 (18) : 3744 - 3754
  • [35] Bootstrapping Composite Quantile Regression
    Seo, Kangmin
    Bang, Sungwan
    Jhun, Myoungshic
    KOREAN JOURNAL OF APPLIED STATISTICS, 2012, 25 (02) : 341 - 350
  • [36] Sparse wavelet estimation in quantile regression with multiple functional predictors
    Yu, Dengdeng
    Zhang, Li
    Mizera, Ivan
    Jiang, Bei
    Kong, Linglong
    COMPUTATIONAL STATISTICS & DATA ANALYSIS, 2019, 136 : 12 - 29
  • [37] Sparse Convoluted Rank Regression in High Dimensions
    Zhou, Le
    Wang, Boxiang
    Zou, Hui
    JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, 2024, 119 (546) : 1500 - 1512
  • [38] Two step composite quantile regression for single-index models
    Jiang, Rong
    Zhou, Zhan-Gong
    Qian, Wei-Min
    Chen, Yong
    COMPUTATIONAL STATISTICS & DATA ANALYSIS, 2013, 64 : 180 - 191
  • [39] Local Composite Quantile Regression for Regression Discontinuity
    Huang, Xiao
    Zhan, Zhaoguo
    JOURNAL OF BUSINESS & ECONOMIC STATISTICS, 2022, 40 (04) : 1863 - 1875
  • [40] Estimation and test procedures for composite quantile regression with covariates missing at random
    Ning, Zijun
    Tang, Linjun
    STATISTICS & PROBABILITY LETTERS, 2014, 95 : 15 - 25