Dual extrapolation for sparse generalized linear models

Cited by: 0

Authors
Massias, Mathurin [1 ]
Vaiter, Samuel [2 ]
Gramfort, Alexandre [1 ]
Salmon, Joseph [3 ]
Affiliations
[1] Université Paris-Saclay, Inria, CEA, Palaiseau, France
[2] CNRS, Institut de Mathématiques de Bourgogne, Dijon 21078, France
[3] IMAG, Univ Montpellier, CNRS, Montpellier 34095, France
Keywords
Convex optimization; Regression analysis; Gradient methods
DOI: none available
Abstract
Generalized Linear Models (GLM) form a wide class of regression and classification models, where prediction is a function of a linear combination of the input variables. For statistical inference in high dimension, sparsity-inducing regularizations have proven useful while offering statistical guarantees. However, solving the resulting optimization problems can be challenging: even for popular iterative algorithms such as coordinate descent, one needs to loop over a large number of variables. To mitigate this, techniques known as screening rules and working sets diminish the size of the optimization problem at hand, either by progressively removing variables, or by solving a growing sequence of smaller problems. For both techniques, significant variables are identified thanks to convex duality arguments. In this paper, we show that the dual iterates of a GLM exhibit a Vector AutoRegressive (VAR) behavior after sign identification, when the primal problem is solved with proximal gradient descent or cyclic coordinate descent. Exploiting this regularity, one can construct dual points that offer tighter certificates of optimality, enhancing the performance of screening rules and working set algorithms. © 2020 Mathurin Massias, Samuel Vaiter, Alexandre Gramfort and Joseph Salmon. License: CC-BY 4.0, see https://creativecommons.org/licenses/by/4.0/. Attribution requirements are provided at http://jmlr.org/papers/v21/19-587.html.
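The abstract describes exploiting the VAR structure of the dual iterates, r_{t+1} ≈ A r_t + b with ρ(A) < 1, to extrapolate toward the fixed point and obtain a better dual point. A minimal sketch of such Anderson-type extrapolation is below; the function name, the fallback behavior, and the exact handling of the small Gram system are illustrative assumptions, not the paper's exact implementation:

```python
import numpy as np

def extrapolate_dual(R):
    """Anderson-type extrapolation of VAR-like dual iterates.

    R : array of shape (K + 1, n) whose rows are the last K + 1 dual
        residual vectors r_0, ..., r_K, assumed to approximately follow
        a VAR recursion r_{t+1} = A r_t + b with spectral radius < 1.
    Returns an affine combination sum_k c_k r_k (k = 1..K, weights
    summing to 1) approximating the VAR fixed point, falling back to
    the last iterate when the small Gram system is singular.
    """
    U = np.diff(R, axis=0)       # rows u_k = r_k - r_{k-1}, shape (K, n)
    G = U @ U.T                  # (K, K) Gram matrix of the differences
    try:
        z = np.linalg.solve(G, np.ones(len(G)))
    except np.linalg.LinAlgError:
        return R[-1]             # degenerate system: keep the plain iterate
    if abs(z.sum()) < 1e-12:     # weights would blow up; fall back
        return R[-1]
    c = z / z.sum()              # minimizes ||sum_k c_k u_k|| s.t. sum_k c_k = 1
    return c @ R[1:]
```

On a toy contracting sequence such as r_t = (0.5**t, 0.9**t), whose fixed point is the origin, the extrapolated point lands strictly closer to the fixed point than the last raw iterate, which is the mechanism behind the tighter dual certificates mentioned in the abstract.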
Related papers
50 items in total
  • [21] Use generalized linear models or generalized partially linear models?
    Xinmin Li
    Haozhe Liang
    Wolfgang Härdle
    Hua Liang
    Statistics and Computing, 2023, 33
  • [22] Empirical Bayes inference in sparse high-dimensional generalized linear models
    Tang, Yiqi
    Martin, Ryan
    ELECTRONIC JOURNAL OF STATISTICS, 2024, 18 (02): : 3212 - 3246
  • [23] Group linear algorithm with sparse principal decomposition: a variable selection and clustering method for generalized linear models
    Juan C. Laria
    M. Carmen Aguilera-Morillo
    Rosa E. Lillo
    Statistical Papers, 2023, 64 : 227 - 253
  • [25] Sparse Linear Isotonic Models
    Chen, Sheng
    Banerjee, Arindam
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 84, 2018, 84
  • [26] Generalized linear models
    Neuhaus, John
    McCulloch, Charles
    WILEY INTERDISCIPLINARY REVIEWS-COMPUTATIONAL STATISTICS, 2011, 3 (05) : 407 - 413
  • [27] Generalized linear models
    McCulloch, CE
    JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, 2000, 95 (452) : 1320 - 1324
  • [28] Generalized linear models
    Burzykowski, Tomasz
    Geubbelmans, Melvin
    Rousseau, Axel-Jan
    Valkenborg, Dirk
    AMERICAN JOURNAL OF ORTHODONTICS AND DENTOFACIAL ORTHOPEDICS, 2023, 164 (04) : 604 - 606
  • [29] GENERALIZED LINEAR MODELS
    NELDER, JA
    WEDDERBURN, RW
    JOURNAL OF THE ROYAL STATISTICAL SOCIETY SERIES A-GENERAL, 1972, 135 (03): : 370 - 384
  • [30] Generalized linear models
    Zezula, Ivan
    BIOMETRIC METHODS AND MODELS IN CURRENT SCIENCE AND RESEARCH, 2011, : 39 - 58