Bayesian L1/2 Regression

Cited: 0
Authors
Ke, Xiongwen [1 ,2 ]
Fan, Yanan [2 ,3 ]
Affiliations
[1] Cent South Univ, Sch Math & Stat, Changsha, Hunan, Peoples R China
[2] UNSW, Sch Math & Stat, Sydney 2052, Australia
[3] CSIRO, Data61, Sydney, Australia
Keywords
Bridge shrinkage; High dimensional regression; MCMC; Sparse optimization; NONCONCAVE PENALIZED LIKELIHOOD; COLLAPSED GIBBS SAMPLERS; ASYMPTOTIC PROPERTIES; ANTIFREEZE PROTEIN; VARIABLE-SELECTION; LINEAR-REGRESSION;
DOI
10.1080/10618600.2024.2374579
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Subject Classification Codes
020208; 070103; 0714;
Abstract
It is well known that Bridge regression enjoys superior theoretical properties compared to the traditional LASSO. However, the current latent variable representation of its Bayesian counterpart, based on the exponential power prior, is computationally expensive in higher dimensions. In this article, we show that the exponential power prior has a closed-form scale mixture of normals decomposition for α = (1/2)^γ, γ ∈ {1, 2, …}. We call these priors L1/2 priors for short. We develop an efficient partially collapsed Gibbs sampling scheme for computation using the L1/2 prior and study its theoretical properties when p > n. In addition, we introduce a non-separable Bridge penalty function inspired by the fully Bayesian formulation, together with a novel, efficient coordinate descent algorithm. We prove the algorithm's convergence and show that the local minimizer from our optimization algorithm has an oracle property. Finally, simulation studies illustrate the performance of the new algorithms. Supplementary materials for this article are available online.
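The abstract's key computational device is a scale-mixture-of-normals representation of the exponential power prior. The paper's closed-form decomposition for α = (1/2)^γ is not reproduced here; as a minimal illustration of the general idea, the snippet below checks the classical Andrews–Mallows identity for the α = 1 (Laplace) case: a standard Laplace variate is N(0, V) with V exponentially distributed with mean 2.

```python
import numpy as np

# Monte Carlo check of the Laplace scale-mixture-of-normals identity
# (Andrews & Mallows): X | V ~ N(0, V), V ~ Exponential(mean 2)
# implies X follows a standard Laplace distribution, density 0.5*exp(-|x|).
# This is the alpha = 1 analogue of the decomposition discussed in the
# abstract, not the paper's alpha = (1/2)^gamma result.
rng = np.random.default_rng(0)
n = 200_000
v = rng.exponential(scale=2.0, size=n)   # mixing variances, mean 2
x = rng.normal(0.0, np.sqrt(v))          # scale-mixture draws

# Compare the empirical CDF of x with the exact Laplace CDF.
xs = np.sort(x)
ecdf = np.arange(1, n + 1) / n
laplace_cdf = np.where(xs < 0, 0.5 * np.exp(xs), 1.0 - 0.5 * np.exp(-xs))
sup_gap = np.max(np.abs(ecdf - laplace_cdf))
print(sup_gap)  # sup-norm gap; should be O(1/sqrt(n)), i.e. very small
```

The match in distribution (rather than a closed-form density argument) is what makes such mixtures useful for Gibbs sampling: conditional on the latent scales, the regression coefficients are Gaussian.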
Pages: 199-210
Page count: 12
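The abstract also describes a coordinate descent algorithm for a Bridge-type penalty. The sketch below is an illustrative stand-in, not the paper's method: it uses a separable L1/2 penalty (the paper's penalty is non-separable) and solves each one-dimensional subproblem by grid search rather than a closed-form update.

```python
import numpy as np

def bridge_cd(X, y, lam, n_iter=100):
    """Cyclic coordinate descent for the separable objective
        0.5 * ||y - X b||^2 + lam * sum_j |b_j|^{1/2}.
    Each 1-D subproblem is solved numerically over a fixed grid;
    a real implementation would use a closed-form thresholding rule."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)          # ||x_j||^2 for each column
    r = y.astype(float).copy()             # residual y - X b (b = 0 initially)
    grid = np.linspace(-3.0, 3.0, 1201)    # candidate values, step 0.005
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * b[j]            # remove coordinate j's contribution
            rho = X[:, j] @ r
            # 1-D objective in z (up to a constant):
            #   0.5 * ||x_j||^2 * z^2 - rho * z + lam * sqrt(|z|)
            obj = (0.5 * col_sq[j] * grid ** 2 - rho * grid
                   + lam * np.sqrt(np.abs(grid)))
            b[j] = grid[np.argmin(obj)]
            r -= X[:, j] * b[j]            # restore residual with new value
    return b
```

On sparse problems the concave square-root penalty tends to set inactive coefficients exactly to zero while only mildly shrinking large active ones, which is the qualitative behavior behind the oracle-type guarantees the abstract mentions.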
Related Papers
50 items (showing [31]-[40])
  • [31] Ordered Weighted l1 Regularized Regression with Strongly Correlated Covariates: Theoretical Aspects
    Figueiredo, Mario A. T.
    Nowak, Robert D.
    ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 51, 2016, 51 : 930 - 938
  • [32] Approximate Leave-One-Out Cross Validation for Regression With l1 Regularizers
    Auddy, Arnab
    Zou, Haolin
    Rad, Kamiar Rahnama
    Maleki, Arian
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2024, 70 (11) : 8040 - 8071
  • [33] High-dimensional QSAR modelling using penalized linear regression model with L1/2-norm
    Algamal, Z. Y.
    Lee, M. H.
    Al-Fakih, A. M.
    Aziz, M.
    SAR AND QSAR IN ENVIRONMENTAL RESEARCH, 2016, 27 (09) : 703 - 719
  • [34] Bayesian Regression Using a Prior on the Model Fit: The R2-D2 Shrinkage Prior
    Zhang, Yan Dora
    Naughton, Brian P.
    Bondell, Howard D.
    Reich, Brian J.
    JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, 2022, 117 (538) : 862 - 874
  • [35] Bayesian bridge regression
    Mallick, Himel
    Yi, Nengjun
    JOURNAL OF APPLIED STATISTICS, 2018, 45 (06) : 988 - 1008
  • [36] l1 Trend Filtering
    Kim, Seung-Jean
    Koh, Kwangmoo
    Boyd, Stephen
    Gorinevsky, Dimitry
    SIAM REVIEW, 2009, 51 (02) : 339 - 360
  • [37] Asymptotic properties for combined L1 and concave regularization
    Fan, Yingying
    Lv, Jinchi
    BIOMETRIKA, 2014, 101 (01) : 57 - 70
  • [38] Speech Emotion Recognition System Based on L1 Regularized Linear Regression and Decision Fusion
    Cen, Ling
    Yu, Zhu Liang
    Dong, Ming Hui
    AFFECTIVE COMPUTING AND INTELLIGENT INTERACTION, PT II, 2011, 6975 : 332 - +
  • [39] Sorted L1/L2 Minimization for Sparse Signal Recovery
    Wang, Chao
    Yan, Ming
    Yu, Junjie
    JOURNAL OF SCIENTIFIC COMPUTING, 2024, 99 (02)
  • [40] A novel L1/2 regularization shooting method for Cox's proportional hazards model
    Luan, Xin-Ze
    Liang, Yong
    Liu, Cheng
    Leung, Kwong-Sak
    Chan, Tak-Ming
    Xu, Zong-Ben
    Zhang, Hai
    SOFT COMPUTING, 2014, 18 (01) : 143 - 152