Sparse estimation in high-dimensional linear errors-in-variables regression via a covariate relaxation method

Times cited: 0
Authors
Li, Xin [1 ]
Wu, Dongya [2 ]
Affiliations
[1] Northwest Univ, Sch Math, Xuefu Rd, Xian 710069, Shaanxi, Peoples R China
[2] Northwest Univ, Sch Informat Sci & Technol, Xuefu Rd, Xian 710069, Shaanxi, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Sparse recovery; Errors-in-variables model; Covariate relaxation; Statistical consistency; MINIMAX RATES; SELECTION;
DOI
10.1007/s11222-023-10312-5
Chinese Library Classification (CLC) code
TP301 [Theory, Methods];
Discipline code
081202;
Abstract
Sparse signal recovery in high-dimensional settings via regularization techniques has been developed over the past two decades and has produced fruitful results in various areas. Previous studies mainly focus on the idealized assumption that covariates are free of noise. In realistic scenarios, however, covariates are often corrupted by measurement errors, which may induce significant estimation bias when methods designed for clean data are naively applied. Recent studies have begun to address errors-in-variables models, but existing methods either depend on the distribution of the covariate noise or are distribution-free yet inconsistent in parameter estimation. A novel covariate relaxation method that does not depend on the distribution of the covariate noise is proposed, and statistical consistency of the parameter estimates is established. Numerical experiments show that the covariate relaxation method achieves estimation accuracy equal to or better than that of the state-of-the-art nonconvex Lasso estimator. Its independence from the covariate noise distribution, together with its small estimation error, suggests its promise in practical applications.
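As a concrete illustration of the bias described in the abstract, the following minimal Python sketch simulates a sparse linear model, corrupts the covariates with additive measurement noise, and fits an ordinary Lasso to both the clean and the noisy design. The dimensions, noise levels, and use of scikit-learn's Lasso are illustrative assumptions only; this is not the covariate relaxation estimator proposed in the paper.

# Minimal sketch: bias of naive sparse regression under covariate measurement error.
# Illustrative setup only; not the authors' covariate relaxation method.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p, s = 200, 500, 5                      # samples, dimension, sparsity level
beta = np.zeros(p)
beta[:s] = 1.0                             # true sparse coefficient vector

X = rng.standard_normal((n, p))            # clean covariates
y = X @ beta + 0.1 * rng.standard_normal(n)
W = X + 0.5 * rng.standard_normal((n, p))  # covariates corrupted by additive noise

lasso_clean = Lasso(alpha=0.1).fit(X, y)   # idealized case: noise-free covariates
lasso_noisy = Lasso(alpha=0.1).fit(W, y)   # naive case: clean-data method on noisy covariates

err_clean = np.linalg.norm(lasso_clean.coef_ - beta)
err_noisy = np.linalg.norm(lasso_noisy.coef_ - beta)
print(f"l2 estimation error with clean covariates: {err_clean:.3f}")
print(f"l2 estimation error with noisy covariates: {err_noisy:.3f}")

On typical draws the l2 error under the noisy design is markedly larger, which is exactly the estimation bias that errors-in-variables methods such as covariate relaxation aim to mitigate.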
Pages: 11