Robust sparse regression by modeling noise as a mixture of Gaussians

Cited: 4
Authors
Xu, Shuang [1 ]
Zhang, Chun-Xia [1 ]
Affiliations
[1] Xi An Jiao Tong Univ, Sch Math & Stat, Xian 710049, Shaanxi, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Robust regression; penalized regression; variable selection; mixture of Gaussians; lasso; VARIABLE SELECTION; REGULARIZATION; SHRINKAGE; ALGORITHM;
DOI
10.1080/02664763.2019.1566448
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Subject Classification Codes
020208; 070103; 0714;
Abstract
Regression analysis has proven to be an effective tool in a wide variety of fields. In many regression models, the noise is assumed to follow a specific distribution. Although such an assumption greatly facilitates theoretical analysis, model-fitting performance may be poor when the assumed distribution deviates substantially from the real noise. Moreover, given the complexity of real-world data, a model is also expected to be robust. Without making any assumption about the noise, we propose in this paper a novel sparse regression method, called MoG-Lasso, which directly models the noise in linear regression via a mixture of Gaussians (MoG). The lasso penalty is included in the loss function of MoG-Lasso to enhance its ability to identify a sparse model. To estimate the parameters of MoG-Lasso, we present an efficient algorithm based on the EM (expectation maximization) and ADMM (alternating direction method of multipliers) algorithms. Experiments on simulated and real data contaminated by complex noise show that MoG-Lasso outperforms several popular methods, including Lasso, LAD-Lasso, and Huber-Lasso, in both 'p>n' and 'p<n' situations.
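The record contains no code, so the following NumPy sketch is an illustration of the general idea only, not the authors' implementation (which solves the M-step with ADMM). It alternates an E-step that assigns residuals to a zero-mean K-component Gaussian mixture with an M-step that refits the coefficients by one coordinate-descent sweep of a precision-weighted lasso. The function name `mog_lasso`, the initializations, and the single-sweep update are all assumptions.

```python
import numpy as np

def soft(z, t):
    """Soft-thresholding operator."""
    return np.sign(z) * max(abs(z) - t, 0.0)

def mog_lasso(X, y, lam=0.1, K=2, n_iter=100):
    """Toy MoG-Lasso sketch: EM with a zero-mean K-component Gaussian
    mixture on the residuals; the M-step for beta is one sweep of
    coordinate descent on a precision-weighted lasso problem."""
    n, p = X.shape
    beta = np.zeros(p)
    pi = np.full(K, 1.0 / K)
    sigma2 = np.arange(1.0, K + 1.0)           # spread the initial variances
    for _ in range(n_iter):
        r = y - X @ beta
        # E-step: responsibility of each mixture component for each residual
        dens = pi * np.exp(-r[:, None] ** 2 / (2 * sigma2)) / np.sqrt(2 * np.pi * sigma2)
        dens = np.maximum(dens, 1e-300)        # guard against underflow
        gamma = dens / dens.sum(axis=1, keepdims=True)
        # M-step (mixture): update component weights and variances
        pi = gamma.mean(axis=0)
        denom = np.maximum(gamma.sum(axis=0), 1e-12)
        sigma2 = np.maximum((gamma * r[:, None] ** 2).sum(axis=0) / denom, 1e-6)
        # Per-observation precision weights: likely outliers get small weight
        w = (gamma / sigma2).sum(axis=1)
        # M-step (beta): one coordinate-descent sweep of the weighted lasso
        for j in range(p):
            r_j = r + X[:, j] * beta[j]        # partial residual without feature j
            z = np.dot(w * X[:, j], r_j)
            a = np.dot(w, X[:, j] ** 2)
            beta[j] = soft(z, lam * n) / a
            r = r_j - X[:, j] * beta[j]
    return beta, pi, sigma2
```

The key robustness mechanism matches the abstract's description: observations attributed to a high-variance mixture component receive small weights `w`, so gross outliers barely influence the lasso fit, while the l1 penalty keeps the coefficient vector sparse.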
Pages: 1738-1755
Number of pages: 18
Related Papers
50 records in total
  • [1] Self-adaptive robust nonlinear regression for unknown noise via mixture of Gaussians
    Wang, Haibo
    Wang, Yun
    Hu, Qinghua
    NEUROCOMPUTING, 2017, 235 : 274 - 286
  • [2] Robust Coordinate Descent Algorithm Robust Solution Path for High-dimensional Sparse Regression Modeling
    Park, H.
    Konishi, S.
    COMMUNICATIONS IN STATISTICS-SIMULATION AND COMPUTATION, 2016, 45 (01) : 115 - 129
  • [3] Robust and sparse estimators for linear regression models
    Smucler, Ezequiel
    Yohai, Victor J.
    COMPUTATIONAL STATISTICS & DATA ANALYSIS, 2017, 111 : 116 - 130
  • [4] Robust and sparse bridge regression
    Li, Bin
    Yu, Qingzhao
    STATISTICS AND ITS INTERFACE, 2009, 2 (04) : 481 - 491
  • [5] Tuning parameter selection in sparse regression modeling
    Hirose, Kei
    Tateishi, Shohei
    Konishi, Sadanori
    COMPUTATIONAL STATISTICS & DATA ANALYSIS, 2013, 59 : 28 - 40
  • [6] Robust and sparse logistic regression
    Cornilly, Dries
    Tubex, Lise
    Van Aelst, Stefan
    Verdonck, Tim
    ADVANCES IN DATA ANALYSIS AND CLASSIFICATION, 2024, 18 (03) : 663 - 679
  • [7] SPARSE STABLE OUTLIER-ROBUST REGRESSION WITH MINIMAX CONCAVE FUNCTION
    Suzuki, Kyohei
    Yukawa, Masahiro
    2022 IEEE 32ND INTERNATIONAL WORKSHOP ON MACHINE LEARNING FOR SIGNAL PROCESSING (MLSP), 2022,
  • [8] SPARSE AND ROBUST LINEAR REGRESSION: AN OPTIMIZATION ALGORITHM AND ITS STATISTICAL PROPERTIES
    Katayama, Shota
    Fujisawa, Hironori
    STATISTICA SINICA, 2017, 27 (03) : 1243 - 1264
  • [9] Adaptive Robust Noise Modeling of Sparse Representation for Bearing Fault Diagnosis
    An, Botao
    Wang, Shibin
    Yan, Ruqiang
    Li, Weihua
    Chen, Xuefeng
    IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2021, 70
  • [10] Robust Sparse Regression with High-Breakdown Value
    Mu, Weiyan
    Xiong, Shifeng
    COMMUNICATIONS IN STATISTICS-THEORY AND METHODS, 2015, 44 (05) : 1033 - 1043