Another look at linear programming for feature selection via methods of regularization

Cited: 4
Authors
Yao, Yonggang [1 ]
Lee, Yoonkyung [2 ]
Affiliations
[1] SAS Inst Inc, Cary, NC 27513 USA
[2] Ohio State Univ, Dept Stat, Columbus, OH 43210 USA
Funding
National Science Foundation (USA);
Keywords
Grouped regularization; l(1)-norm penalty; Parametric linear programming; Quantile regression; Simplex method; Structured learning; Support vector machines; SUPPORT VECTOR MACHINES; INTERIOR-POINT METHODS; VARIABLE SELECTION; REGRESSION SHRINKAGE; ALGORITHM; PATH; DEVIATIONS;
DOI
10.1007/s11222-013-9408-2
Chinese Library Classification
TP301 [Theory and Methods];
Discipline Code
081202;
Abstract
We consider statistical procedures for feature selection defined by a family of regularization problems with convex piecewise linear loss functions and penalties of l(1) nature. Many known statistical procedures (e.g. quantile regression and support vector machines with l(1)-norm penalty) are subsumed under this category. Computationally, the regularization problems are linear programming (LP) problems indexed by a single parameter, which are known as 'parametric cost LP' or 'parametric right-hand-side LP' in the optimization theory. Exploiting the connection with the LP theory, we lay out general algorithms, namely, the simplex algorithm and its variant for generating regularized solution paths for the feature selection problems. The significance of such algorithms is that they allow a complete exploration of the model space along the paths and provide a broad view of persistent features in the data. The implications of the general path-finding algorithms are outlined for several statistical procedures, and they are illustrated with numerical examples.
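To make the LP connection in the abstract concrete, the sketch below casts l(1)-penalized quantile regression as a standard-form LP and solves it for a single penalty value with SciPy's `linprog`. This is only an illustrative one-shot solve, not the paper's parametric simplex path algorithm; the function name and variable names are hypothetical.

```python
# Sketch: l(1)-penalized quantile regression as a linear program.
# Solves for ONE fixed penalty value lam; the paper's contribution is a
# parametric simplex method that traces the entire solution path over lam.
import numpy as np
from scipy.optimize import linprog

def l1_quantile_lp(X, y, tau=0.5, lam=1.0):
    """Minimize sum_i rho_tau(y_i - x_i' beta) + lam * ||beta||_1 via LP.

    Decision variables z = [beta+, beta-, u, v] >= 0, where
    beta = beta+ - beta-  and  y_i - x_i' beta = u_i - v_i.
    """
    n, p = X.shape
    # Cost: lam on both signed parts of beta; tau on u, (1 - tau) on v.
    c = np.concatenate([lam * np.ones(2 * p),
                        tau * np.ones(n),
                        (1 - tau) * np.ones(n)])
    # Equality constraints: X beta+ - X beta- + u - v = y.
    A_eq = np.hstack([X, -X, np.eye(n), -np.eye(n)])
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")
    z = res.x
    return z[:p] - z[p:2 * p]

rng = np.random.default_rng(0)
X = rng.standard_normal((60, 3))
beta_true = np.array([2.0, 0.0, -1.0])
y = X @ beta_true + 0.05 * rng.standard_normal(60)

beta_small = l1_quantile_lp(X, y, tau=0.5, lam=0.01)  # weak penalty
beta_big = l1_quantile_lp(X, y, tau=0.5, lam=1e4)     # penalty dominates
```

With a weak penalty the fit tracks `beta_true`; once the penalty is large enough, every coefficient is driven exactly to zero. Because both the loss and the penalty are piecewise linear, the solution path between these extremes is itself piecewise linear in the penalty parameter, which is the structure the simplex-based path algorithms exploit.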
Pages: 885-905
Page count: 21
Related papers
50 items in total
  • [31] Multi-label feature selection via manifold regularization and dependence maximization
    Huang, Rui
    Wu, Zhejun
    Pattern Recognition, 2021, 120
  • [32] Robust Flexible Feature Selection via Exclusive L21 Regularization
    Ming, Di
    Ding, Chris
    PROCEEDINGS OF THE TWENTY-EIGHTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2019, : 3158 - 3164
  • [34] Hierarchical Feature Selection with Recursive Regularization
    Zhao, Hong
    Zhu, Pengfei
    Wang, Ping
    Hu, Qinghua
    PROCEEDINGS OF THE TWENTY-SIXTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2017, : 3483 - 3489
  • [35] Multi-label feature selection via robust flexible sparse regularization
    Li, Yonghao
    Hu, Liang
    Gao, Wanfu
    PATTERN RECOGNITION, 2023, 134
  • [36] Multi-label feature selection via constraint mapping space regularization
    Li, Bangna
    Zhang, Qingqing
    He, Xingshi
    ELECTRONIC RESEARCH ARCHIVE, 2024, 32 (04): : 2598 - 2620
  • [37] Multi-label feature selection via nonlinear mapping and manifold regularization
    Wang, Yan
    Wang, Changzhong
    Deng, Tingquan
    Li, Wenqi
    INFORMATION SCIENCES, 2025, 704
  • [38] Non-linear Feature Selection Based on Convolution Neural Networks with Sparse Regularization
    Wu, Wen-Bin
    Chen, Si-Bao
    Ding, Chris
    Luo, Bin
    COGNITIVE COMPUTATION, 2024, 16 (02) : 654 - 670
  • [40] ANOTHER LOOK AT ANALYSIS OF LINEAR MODELS
    HOCKING, RR
    SPEED, FM
    ANNALS OF MATHEMATICAL STATISTICS, 1969, 40 (05): : 1881 - +