Bayesian L1/2 Regression

Times Cited: 0
Authors
Ke, Xiongwen [1 ,2 ]
Fan, Yanan [2 ,3 ]
Affiliations
[1] Cent South Univ, Sch Math & Stat, Changsha, Hunan, Peoples R China
[2] UNSW, Sch Math & Stat, Sydney 2052, Australia
[3] CSIRO, Data61, Sydney, Australia
Keywords
Bridge shrinkage; High dimensional regression; MCMC; Sparse optimization; NONCONCAVE PENALIZED LIKELIHOOD; COLLAPSED GIBBS SAMPLERS; ASYMPTOTIC PROPERTIES; ANTIFREEZE PROTEIN; VARIABLE-SELECTION; LINEAR-REGRESSION;
DOI
10.1080/10618600.2024.2374579
CLC Number
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Subject Classification Codes
020208; 070103; 0714
Abstract
It is well known that Bridge regression enjoys superior theoretical properties compared to the traditional LASSO. However, the current latent variable representation of its Bayesian counterpart, based on the exponential power prior, is computationally expensive in higher dimensions. In this article, we show that the exponential power prior has a closed-form scale mixture of normals decomposition for α = (1/2)^γ, γ ∈ {1, 2, ...}. We call these priors L1/2 priors for short. We develop an efficient partially collapsed Gibbs sampling scheme for computation with the L1/2 prior and study its theoretical properties when p > n. In addition, we introduce a non-separable Bridge penalty function inspired by the fully Bayesian formulation, along with a novel, efficient coordinate descent algorithm. We prove the algorithm's convergence and show that the local minimizer it produces has an oracle property. Finally, simulation studies illustrate the performance of the new algorithms. Supplementary materials for this article are available online.
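
For context on the decomposition: a scale mixture of normals representation writes the exponential power prior as an integral over zero-mean Gaussians,

    exp(-η|β|^α) ∝ ∫₀^∞ N(β; 0, s) g(s) ds,

for some mixing density g(s) on the variance. For α = 1 (the Laplace prior underlying the Bayesian LASSO) the mixing density is exponential (Andrews and Mallows 1974); the abstract's result is that g also admits a closed form when α = (1/2)^γ, with the exact form given in the paper.

To make the optimization side concrete, below is a minimal, generic sketch of cyclic coordinate descent under the ordinary separable bridge penalty with α = 1/2, i.e. minimizing (1/(2n))||y - Xβ||² + λ Σ_j |β_j|^(1/2). This is an illustration under our own naming only; it is not the paper's partially collapsed Gibbs sampler and it uses the plain separable penalty rather than the article's non-separable one. Each univariate update reduces to a cubic in t = √|β_j|, whose positive real roots are compared against β_j = 0 because the penalty is non-convex.

    import numpy as np

    def half_power_update(z, a, lam):
        """Minimize g(b) = (a/2)*(b - z)**2 + lam*|b|**0.5 over a scalar b.

        Any nonzero minimizer shares the sign of z; substituting t = sqrt(|b|)
        turns the stationarity condition a*(t**2 - |z|) + lam/(2*t) = 0 into
        the cubic 2*a*t**3 - 2*a*|z|*t + lam = 0, so we enumerate its positive
        real roots and compare the objective against the candidate b = 0.
        """
        s, az = np.sign(z), abs(z)
        best_b, best_g = 0.0, 0.5 * a * az**2          # objective value at b = 0
        for t in np.roots([2.0 * a, 0.0, -2.0 * a * az, lam]):
            if abs(t.imag) < 1e-10 and t.real > 0.0:
                b = t.real**2
                g = 0.5 * a * (b - az)**2 + lam * np.sqrt(b)
                if g < best_g:
                    best_b, best_g = b, g
        return s * best_b

    def cd_bridge_half(X, y, lam, n_sweeps=500, tol=1e-8):
        """Cyclic coordinate descent for
        (1/(2n))*||y - X @ beta||**2 + lam * sum_j |beta_j|**0.5."""
        n, p = X.shape
        beta = np.zeros(p)
        resid = y.astype(float)                        # resid = y - X @ beta
        col_sq = (X**2).sum(axis=0) / n                # a_j = ||x_j||^2 / n
        for _ in range(n_sweeps):
            max_step = 0.0
            for j in range(p):
                if col_sq[j] == 0.0:
                    continue
                # Unpenalized univariate minimizer given the other coordinates.
                z = X[:, j] @ resid / (n * col_sq[j]) + beta[j]
                b_new = half_power_update(z, col_sq[j], lam)
                if b_new != beta[j]:
                    resid += X[:, j] * (beta[j] - b_new)
                    max_step = max(max_step, abs(b_new - beta[j]))
                    beta[j] = b_new
            if max_step < tol:                         # no coordinate moved: done
                break
        return beta

A quick check on synthetic data (sparse truth, n = 100, p = 20):

    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 20))
    beta_true = np.zeros(20)
    beta_true[:3] = [3.0, -2.0, 1.5]
    y = X @ beta_true + 0.5 * rng.standard_normal(100)
    print(np.round(cd_bridge_half(X, y, lam=0.1), 2))  # nonzeros near the truth

Because the objective is non-convex, the sweep converges to a local minimizer that depends on the starting point; the zero start above mimics the cold start common in penalized regression software.
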
Pages: 199-210
Page Count: 12
Related Papers
50 records in total
  • [41] Penalized high-dimensional M-quantile regression: From L1 to Lp optimization
    Hu, Jie
    Chen, Yu
    Zhang, Weiping
    Guo, Xiao
    CANADIAN JOURNAL OF STATISTICS-REVUE CANADIENNE DE STATISTIQUE, 2021, 49 (03) : 875 - 905
  • [42] The Smooth-Lasso and other l1 + l2-penalized methods
    Hebiri, Mohamed
    van de Geer, Sara
    ELECTRONIC JOURNAL OF STATISTICS, 2011, 5 : 1184 - 1226
  • [43] L1-2 Regularized Logistic Regression
    Qin, Jing
    Lou, Yifei
    CONFERENCE RECORD OF THE 2019 FIFTY-THIRD ASILOMAR CONFERENCE ON SIGNALS, SYSTEMS & COMPUTERS, 2019, : 779 - 783
  • [44] Bayesian bridge quantile regression
    Alhamzawi, Rahim
    Algamal, Zakariya Yahya
    COMMUNICATIONS IN STATISTICS-SIMULATION AND COMPUTATION, 2019, 48 (03) : 944 - 956
  • [45] Multiple-Inflation Poisson Model with L1 Regularization
    Su, Xiaogang
    Fan, Juanjuan
    Levine, Richard A.
    Tan, Xianming
    Tripathi, Arvind
    STATISTICA SINICA, 2013, 23 (03) : 1071 - 1090
  • [46] The Bayesian adaptive lasso regression
    Alhamzawi, Rahim
    Ali, Haithem Taha Mohammad
    MATHEMATICAL BIOSCIENCES, 2018, 303 : 75 - 82
  • [47] Gene Selection based on Fuzzy measure with L1 regularization
    Wang, Jinfeng
    Chen, Jiajie
    Wang, Hui
    2018 21ST IEEE INTERNATIONAL CONFERENCE ON COMPUTATIONAL SCIENCE AND ENGINEERING (CSE 2018), 2018, : 157 - 163
  • [48] The L1/2 regularization method for variable selection in the Cox model
    Liu, Cheng
    Liang, Yong
    Luan, Xin-Ze
    Leung, Kwong-Sak
    Chan, Tak-Ming
    Xu, Zong-Ben
    Zhang, Hai
    APPLIED SOFT COMPUTING, 2014, 14 : 498 - 503
  • [49] Robust variable selection for the varying coefficient model based on composite L1-L2 regression
    Zhao, Weihua
    Zhang, Riquan
    Liu, Jicai
    JOURNAL OF APPLIED STATISTICS, 2013, 40 (09) : 2024 - 2040
  • [50] L1 Regularization for High-Dimensional Multivariate GARCH Models
    Yao, Sijie
    Zou, Hui
    Xing, Haipeng
    RISKS, 2024, 12 (02)