Sparse estimation in high-dimensional linear errors-in-variables regression via a covariate relaxation method

Times Cited: 0
Authors
Li, Xin [1 ]
Wu, Dongya [2 ]
Affiliations
[1] Northwest Univ, Sch Math, Xuefu Rd, Xian 710069, Shaanxi, Peoples R China
[2] Northwest Univ, Sch Informat Sci & Technol, Xuefu Rd, Xian 710069, Shaanxi, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Sparse recovery; Errors-in-variables model; Covariate relaxation; Statistical consistency; MINIMAX RATES; SELECTION;
DOI
10.1007/s11222-023-10312-5
Chinese Library Classification
TP301 [Theory, Methods];
Discipline Code
081202;
Abstract
Sparse signal recovery in high-dimensional settings via regularization techniques has been developed over the past two decades and has produced fruitful results in various areas. Previous studies mainly focus on the idealized assumption that covariates are free of noise. In realistic scenarios, however, covariates are always corrupted by measurement errors, which may induce significant estimation bias when methods designed for clean data are naively applied. Recent studies have begun to deal with errors-in-variables models. Current methods either depend on the distribution of the covariate noise or, if distribution-free, are inconsistent in parameter estimation. A novel covariate relaxation method that does not depend on the distribution of the covariate noise is proposed, and statistical consistency of the parameter estimates is established. Numerical experiments show that the covariate relaxation method achieves the same or even better estimation accuracy than the state-of-the-art nonconvex Lasso estimator. Its independence from the distribution of the covariate noise, combined with its small estimation error, suggests its promise in practical applications.
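To make the setting concrete, the following is a minimal simulation sketch (not the covariate relaxation estimator proposed in the paper): it generates data from an errors-in-variables linear model, where the response follows y = X beta* + eps but only noise-corrupted covariates Z = X + W are observed, and contrasts a Lasso fit on the clean covariates with a naive Lasso fit on the corrupted ones. The dimensions, noise levels, and regularization level are illustrative assumptions.

# Minimal errors-in-variables simulation sketch (illustrative, not the paper's method).
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p, s = 100, 500, 5                     # samples, dimension, sparsity (assumed values)

beta_star = np.zeros(p)
beta_star[:s] = 1.0                       # sparse ground-truth coefficients

X = rng.standard_normal((n, p))           # clean covariates (unobserved in practice)
eps = 0.5 * rng.standard_normal(n)        # response noise
y = X @ beta_star + eps

W = 0.5 * rng.standard_normal((n, p))     # additive covariate measurement error
Z = X + W                                 # observed, corrupted covariates

lam = 0.1                                 # illustrative regularization level
fit_clean = Lasso(alpha=lam).fit(X, y)    # oracle fit on clean covariates
fit_noisy = Lasso(alpha=lam).fit(Z, y)    # naive fit on corrupted covariates

err_clean = np.linalg.norm(fit_clean.coef_ - beta_star)
err_noisy = np.linalg.norm(fit_noisy.coef_ - beta_star)
print(f"l2 estimation error, clean covariates: {err_clean:.3f}")
print(f"l2 estimation error, noisy covariates: {err_noisy:.3f}")

Under this additive-noise setup, the naive fit on Z typically incurs a noticeably larger l2 error than the fit on X, which illustrates the estimation bias that methods for errors-in-variables models, including the proposed covariate relaxation approach, are designed to address.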
Pages: 11