Another look at linear programming for feature selection via methods of regularization

Cited: 4
Authors
Yao, Yonggang [1 ]
Lee, Yoonkyung [2 ]
Affiliations
[1] SAS Inst Inc, Cary, NC 27513 USA
[2] Ohio State Univ, Dept Stat, Columbus, OH 43210 USA
Funding
National Science Foundation (USA)
Keywords
Grouped regularization; l1-norm penalty; Parametric linear programming; Quantile regression; Simplex method; Structured learning; Support vector machines; SUPPORT VECTOR MACHINES; INTERIOR-POINT METHODS; VARIABLE SELECTION; REGRESSION SHRINKAGE; ALGORITHM; PATH; DEVIATIONS
DOI
10.1007/s11222-013-9408-2
Chinese Library Classification
TP301 [Theory, Methods]
Discipline code
081202
Abstract
We consider statistical procedures for feature selection defined by a family of regularization problems with convex piecewise linear loss functions and penalties of an l1 nature. Many known statistical procedures (e.g. quantile regression and support vector machines with the l1-norm penalty) are subsumed under this category. Computationally, the regularization problems are linear programming (LP) problems indexed by a single parameter, known as 'parametric cost LP' or 'parametric right-hand-side LP' in optimization theory. Exploiting the connection with LP theory, we lay out general algorithms, namely, the simplex algorithm and its variant, for generating regularized solution paths for the feature selection problems. The significance of such algorithms is that they allow a complete exploration of the model space along the paths and provide a broad view of persistent features in the data. The implications of the general path-finding algorithms are outlined for several statistical procedures, and they are illustrated with numerical examples.
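To make the LP connection concrete, the following is a minimal sketch (not the paper's path-tracing algorithm) of one member of the family described above: l1-penalized quantile regression at a single, fixed value of the regularization parameter, cast as a standard-form LP and handed to an off-the-shelf solver. The helper name `l1_quantile_lp` and the parameters `tau` and `lam` are illustrative choices, not notation from the paper.

```python
import numpy as np
from scipy.optimize import linprog

def l1_quantile_lp(X, y, tau=0.5, lam=1.0):
    """Solve min_b sum_i rho_tau(y_i - x_i'b) + lam * ||b||_1 as an LP.

    Split b = bp - bn and the residuals r = u - v, all parts >= 0, giving
        minimize    tau*1'u + (1-tau)*1'v + lam*1'(bp + bn)
        subject to  X(bp - bn) + u - v = y,  bp, bn, u, v >= 0,
    which matches the quantile loss rho_tau(r) = tau*r+ + (1-tau)*r-.
    """
    n, p = X.shape
    # Variable order: [bp (p), bn (p), u (n), v (n)]
    c = np.concatenate([lam * np.ones(2 * p),
                        tau * np.ones(n),
                        (1 - tau) * np.ones(n)])
    A_eq = np.hstack([X, -X, np.eye(n), -np.eye(n)])
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")
    bp, bn = res.x[:p], res.x[p:2 * p]
    return bp - bn

# Toy data: only the first of three features is active.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = X @ np.array([2.0, 0.0, 0.0]) + 0.1 * rng.normal(size=50)
beta = l1_quantile_lp(X, y, tau=0.5, lam=0.5)
```

Re-solving this LP on a grid of `lam` values traces the solution path pointwise; the paper's contribution is to obtain the entire piecewise-linear path in one pass by running the (parametric) simplex method in `lam` rather than solving each LP from scratch.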
Pages: 885-905
Page count: 21