The Impact of Regularization on High-dimensional Logistic Regression

Cited by: 0
Authors
Salehi, Fariborz [1 ]
Abbasi, Ehsan [1 ]
Hassibi, Babak [1 ]
Affiliation
[1] CALTECH, Dept Elect Engn, Pasadena, CA 91125 USA
Funding
US National Science Foundation
Keywords
GENERALIZED LINEAR-MODELS; SELECTION;
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Logistic regression is commonly used for modeling dichotomous outcomes. In the classical setting, where the number of observations is much larger than the number of parameters, properties of the maximum likelihood estimator in logistic regression are well understood. Recently, Sur and Candes [26] have studied logistic regression in the high-dimensional regime, where the number of observations and parameters are comparable, and show, among other things, that the maximum likelihood estimator is biased. In the high-dimensional regime the underlying parameter vector is often structured (sparse, block-sparse, finite-alphabet, etc.) and so in this paper we study regularized logistic regression (RLR), where a convex regularizer that encourages the desired structure is added to the negative of the log-likelihood function. An advantage of RLR is that it allows parameter recovery even for instances where the (unconstrained) maximum likelihood estimate does not exist. We provide a precise analysis of the performance of RLR via the solution of a system of six nonlinear equations, through which any performance metric of interest (mean, mean-squared error, probability of support recovery, etc.) can be explicitly computed. Our results generalize those of Sur and Candes, and we provide a detailed study for the cases of ℓ2^2-RLR and sparse (ℓ1-regularized) logistic regression. In both cases, we obtain explicit expressions for various performance metrics and can find the values of the regularizer parameter that optimize the desired performance. The theory is validated by extensive numerical simulations across a range of parameter values and problem instances.
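To make the setup concrete, the sketch below fits sparse (ℓ1-regularized) logistic regression — the negative log-likelihood plus a λ‖β‖₁ penalty, as described in the abstract — on a small high-dimensional instance with a sparse ground-truth parameter vector. This is a generic proximal-gradient (ISTA) implementation written for illustration, not the authors' analysis code; the problem sizes, step size, and λ value are arbitrary assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def l1_logistic_regression(X, y, lam, lr=0.1, iters=2000):
    """Proximal gradient (ISTA) for l1-regularized logistic regression.

    Minimizes (1/n) * negative log-likelihood + lam * ||beta||_1,
    with labels y in {0, 1}.
    """
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(iters):
        grad = X.T @ (sigmoid(X @ beta) - y) / n          # gradient of the NLL
        z = beta - lr * grad                              # gradient step
        beta = np.sign(z) * np.maximum(np.abs(z) - lr * lam, 0.0)  # soft-threshold
    return beta

# A small high-dimensional instance: n and p comparable, sparse ground truth.
rng = np.random.default_rng(0)
n, p, k = 200, 100, 5                 # observations, parameters, nonzeros
beta_true = np.zeros(p)
beta_true[:k] = 3.0
X = rng.standard_normal((n, p))
y = (rng.random(n) < sigmoid(X @ beta_true)).astype(float)

beta_hat = l1_logistic_regression(X, y, lam=0.05)
print("estimates on the true support:", np.round(beta_hat[:k], 2))
```

From the recovered `beta_hat`, metrics such as mean-squared error or the fraction of the support recovered can be computed empirically and, per the paper, compared against the predictions of the six-equation system as λ varies.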
Pages: 11