Bayesian L1/2 Regression

Cited: 0
Authors
Ke, Xiongwen [1 ,2 ]
Fan, Yanan [2 ,3 ]
Affiliations
[1] Cent South Univ, Sch Math & Stat, Changsha, Hunan, Peoples R China
[2] UNSW, Sch Math & Stat, Sydney 2052, Australia
[3] CSIRO, Data61, Sydney, Australia
Keywords
Bridge shrinkage; High dimensional regression; MCMC; Sparse optimization; NONCONCAVE PENALIZED LIKELIHOOD; COLLAPSED GIBBS SAMPLERS; ASYMPTOTIC PROPERTIES; ANTIFREEZE PROTEIN; VARIABLE-SELECTION; LINEAR-REGRESSION;
DOI
10.1080/10618600.2024.2374579
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline Codes
020208; 070103; 0714
Abstract
It is well known that Bridge regression enjoys superior theoretical properties compared to the traditional LASSO. However, the current latent variable representation of its Bayesian counterpart, based on the exponential power prior, is computationally expensive in higher dimensions. In this article, we show that the exponential power prior has a closed-form scale mixture of normals decomposition for α = (1/2)^γ, γ ∈ {1, 2, …}. We call these priors L1/2 priors for short. We develop an efficient partially collapsed Gibbs sampling scheme for computation with the L1/2 prior and study its theoretical properties when p > n. In addition, we introduce a non-separable Bridge penalty function inspired by the fully Bayesian formulation, together with a novel, efficient coordinate descent algorithm. We prove the algorithm's convergence and show that the local minimizer from our optimization algorithm has an oracle property. Finally, simulation studies illustrate the performance of the new algorithms. Supplementary materials for this article are available online.
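To make the optimization side of the abstract concrete, here is a minimal, separable sketch of bridge-penalized least squares with exponent α = 1/2, solved by cyclic coordinate descent. This is not the paper's algorithm: the paper's penalty is non-separable and comes with proven convergence and oracle properties, whereas this sketch uses the plain separable penalty λ Σ_j |β_j|^(1/2) and a numerical scalar minimizer. The names `prox_half` and `bridge_cd` and all tuning values are our assumptions, not the authors'.

```python
# Sketch only: cyclic coordinate descent for the separable bridge problem
#   min_beta (1/(2n)) ||y - X beta||^2 + lam * sum_j |beta_j|^(1/2).
# Each coordinate update reduces to a 1-D problem
#   min_b (a/2)(b - z)^2 + lam * sqrt(|b|),
# which we solve numerically and compare against b = 0 (hard zeroing).
import numpy as np
from scipy.optimize import minimize_scalar

def prox_half(z, lam, a):
    """Numerically minimize (a/2)*(b - z)**2 + lam*sqrt(|b|) over b."""
    if z == 0.0:
        return 0.0
    s, az = np.sign(z), abs(z)
    # The minimizer shares z's sign (or is exactly 0) and lies in [0, |z|],
    # since the penalty only grows as |b| increases.
    res = minimize_scalar(lambda b: 0.5 * a * (b - az) ** 2 + lam * np.sqrt(b),
                          bounds=(0.0, az), method="bounded")
    f0 = 0.5 * a * z ** 2          # objective value at b = 0
    return s * res.x if res.fun < f0 else 0.0

def bridge_cd(X, y, lam, n_iter=200):
    """Cyclic coordinate descent with a running residual."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n      # a_j = x_j'x_j / n
    r = y - X @ beta                       # residual kept up to date
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * beta[j]         # remove coordinate j's fit
            z = X[:, j] @ r / (n * col_sq[j])  # 1-D least-squares target
            beta[j] = prox_half(z, lam, col_sq[j])
            r -= X[:, j] * beta[j]         # restore residual
    return beta
```

On sparse synthetic data with a modest λ, the explicit comparison against b = 0 in `prox_half` sets small coefficients exactly to zero, reproducing the hard-thresholding behavior that makes α < 1 bridge penalties attractive for variable selection.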
Pages: 199-210
Page count: 12
Related Papers
50 records in total
  • [1] Bayesian tobit quantile regression with L1/2 penalty
    Alhamzawi, Rahim
    Ali, Haithem Taha Mohammad
    COMMUNICATIONS IN STATISTICS-SIMULATION AND COMPUTATION, 2018, 47 (06) : 1739 - 1750
  • [2] Bayesian joint inference for multivariate quantile regression model with L1/2 penalty
    Tian, Yu-Zhu
    Tang, Man-Lai
    Tian, Mao-Zai
    COMPUTATIONAL STATISTICS, 2021, 36 (04) : 2967 - 2994
  • [3] Bayesian relative composite quantile regression approach of ordinal latent regression model with L1/2 regularization
    Tian, Yu-Zhu
    Wu, Chun-Ho
    Tai, Ling-Nan
    Mian, Zhi-Bao
    Tian, Mao-Zai
    STATISTICAL ANALYSIS AND DATA MINING, 2024, 17 (02)
  • [4] A Survey of L1 Regression
    Vidaurre, Diego
    Bielza, Concha
    Larranaga, Pedro
    INTERNATIONAL STATISTICAL REVIEW, 2013, 81 (03) : 361 - 387
  • [5] Fully Bayesian L1/2-penalized linear quantile regression analysis with autoregressive errors
    Tian, Yuzhu
    Song, Xinyuan
    STATISTICS AND ITS INTERFACE, 2020, 13 (03) : 271 - 286
  • [6] Honest variable selection in linear and logistic regression models via l1 and l1 + l2 penalization
    Bunea, Florentina
    ELECTRONIC JOURNAL OF STATISTICS, 2008, 2 : 1153 - 1194
  • [7] A novel l1/2 sparse regression method for hyperspectral unmixing
    Sun, Le
    Wu, Zebin
    Xiao, Liang
    Liu, Jianjun
    Wei, Zhihui
    Dang, Fuxing
    INTERNATIONAL JOURNAL OF REMOTE SENSING, 2013, 34 (20) : 6983 - 7001
  • [8] The L1 penalized LAD estimator for high dimensional linear regression
    Wang, Lie
    JOURNAL OF MULTIVARIATE ANALYSIS, 2013, 120 : 135 - 151
  • [9] Dealing with the multiplicity of solutions of the l1 and l∞ regression models
    Castillo, Enrique
    Minguez, Roberto
    Castillo, Carmen
    Cofino, Antonio S.
    EUROPEAN JOURNAL OF OPERATIONAL RESEARCH, 2008, 188 (02) : 460 - 484
  • [10] Bayesian model selection:: a predictive approach with losses based on distances L1 and L2
    de la Horra, J
    Rodríguez-Bernal, MT
    STATISTICS & PROBABILITY LETTERS, 2005, 71 (03) : 257 - 265