Bayesian L1/2 Regression

Cited by: 0
Authors
Ke, Xiongwen [1 ,2 ]
Fan, Yanan [2 ,3 ]
Affiliations
[1] Cent South Univ, Sch Math & Stat, Changsha, Hunan, Peoples R China
[2] UNSW, Sch Math & Stat, Sydney 2052, Australia
[3] CSIRO, Data61, Sydney, Australia
Keywords
Bridge shrinkage; High dimensional regression; MCMC; Sparse optimization; NONCONCAVE PENALIZED LIKELIHOOD; COLLAPSED GIBBS SAMPLERS; ASYMPTOTIC PROPERTIES; ANTIFREEZE PROTEIN; VARIABLE-SELECTION; LINEAR-REGRESSION
DOI
10.1080/10618600.2024.2374579
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline codes
020208; 070103; 0714
Abstract
It is well known that Bridge regression enjoys superior theoretical properties compared to the traditional LASSO. However, the current latent variable representation of its Bayesian counterpart, based on the exponential power prior, is computationally expensive in higher dimensions. In this article, we show that the exponential power prior has a closed-form scale mixture of normals decomposition for alpha = (1/2)^gamma, gamma ∈ {1, 2, …}. We call this class of priors the L1/2 prior for short. We develop an efficient partially collapsed Gibbs sampling scheme for computation using the L1/2 prior and study its theoretical properties when p > n. In addition, we introduce a non-separable Bridge penalty function inspired by the fully Bayesian formulation, together with a novel, efficient coordinate descent algorithm. We prove the algorithm's convergence and show that the local minimizer produced by our optimization algorithm has an oracle property. Finally, simulation studies illustrate the performance of the new algorithms. Supplementary materials for this article are available online.
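To make the optimization side of the abstract concrete, the sketch below implements generic cyclic coordinate descent for bridge-penalized least squares with the separable penalty lam * sum_j |b_j|^(1/2). It is a minimal illustration under stated assumptions, not the authors' algorithm: the paper's penalty is non-separable and its coordinate descent and convergence analysis differ. The function names (bridge_univariate, bridge_coordinate_descent) are hypothetical, the scalar subproblem is solved numerically rather than by a closed-form half-thresholding rule, and unit-norm columns of X are assumed.

```python
import numpy as np
from scipy.optimize import minimize_scalar


def bridge_univariate(c, lam, alpha=0.5):
    """Minimize f(z) = 0.5 * (z - c)**2 + lam * |z|**alpha over z.

    Solved numerically for simplicity. The minimizer is either exactly 0
    or lies between 0 and c: the penalty is symmetric, so crossing zero
    or overshooting c can only increase f.
    """
    if c == 0.0:
        return 0.0
    res = minimize_scalar(
        lambda z: 0.5 * (z - c) ** 2 + lam * abs(z) ** alpha,
        bounds=(min(0.0, c), max(0.0, c)),
        method="bounded",
    )
    # f can have two local minima (at 0 and in the interior), so compare
    # the interior candidate against the exact-zero solution.
    return res.x if res.fun < 0.5 * c**2 else 0.0


def bridge_coordinate_descent(X, y, lam, alpha=0.5, n_iter=100):
    """Cyclic coordinate descent for
    0.5 * ||y - X @ b||^2 + lam * sum_j |b_j|**alpha.

    Assumes the columns of X are standardized to unit l2 norm, so each
    coordinate update reduces to bridge_univariate.
    """
    n, p = X.shape
    b = np.zeros(p)
    r = y - X @ b  # running residual
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * b[j]   # remove coordinate j from the fit
            c = X[:, j] @ r       # univariate least-squares solution
            b[j] = bridge_univariate(c, lam, alpha)
            r -= X[:, j] * b[j]   # put the updated coordinate back
    return b


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, p = 100, 200                    # p > n, as in the paper's setting
    X = rng.standard_normal((n, p))
    X /= np.linalg.norm(X, axis=0)     # unit-norm columns
    b_true = np.zeros(p)
    b_true[:5] = 3.0
    y = X @ b_true + 0.1 * rng.standard_normal(n)
    b_hat = bridge_coordinate_descent(X, y, lam=0.05)
    print("selected coordinates:", np.flatnonzero(b_hat))
```

Solving each scalar subproblem exactly, including the explicit comparison with zero, is what lets the concave |z|^(1/2) penalty produce exact zeros, the sparsity behaviour that motivates bridge-type methods.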
Pages: 199-210
Number of pages: 12