Another look at linear programming for feature selection via methods of regularization

Times Cited: 4
Authors
Yao, Yonggang [1 ]
Lee, Yoonkyung [2 ]
Affiliations
[1] SAS Inst Inc, Cary, NC 27513 USA
[2] Ohio State Univ, Dept Stat, Columbus, OH 43210 USA
Funding
National Science Foundation (US);
Keywords
Grouped regularization; l1-norm penalty; Parametric linear programming; Quantile regression; Simplex method; Structured learning; Support vector machines; SUPPORT VECTOR MACHINES; INTERIOR-POINT METHODS; VARIABLE SELECTION; REGRESSION SHRINKAGE; ALGORITHM; PATH; DEVIATIONS;
DOI
10.1007/s11222-013-9408-2
CLC Classification
TP301 [Theory and Methods];
Discipline Code
081202;
Abstract
We consider statistical procedures for feature selection defined by a family of regularization problems with convex piecewise linear loss functions and penalties of l1 nature. Many known statistical procedures (e.g. quantile regression and support vector machines with l1-norm penalty) are subsumed under this category. Computationally, the regularization problems are linear programming (LP) problems indexed by a single parameter, which are known as 'parametric cost LP' or 'parametric right-hand-side LP' in the optimization theory. Exploiting the connection with the LP theory, we lay out general algorithms, namely, the simplex algorithm and its variant for generating regularized solution paths for the feature selection problems. The significance of such algorithms is that they allow a complete exploration of the model space along the paths and provide a broad view of persistent features in the data. The implications of the general path-finding algorithms are outlined for several statistical procedures, and they are illustrated with numerical examples.
Pages: 885-905
Page count: 21
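
As a concrete companion to the abstract, the sketch below poses l1-penalized quantile regression, one of the procedures the abstract names, as a standard-form LP and re-solves it over a grid of penalty values with SciPy's HiGHS solver. This is only an illustration under assumed names (X, y, tau, lam); the paper's contribution is a parametric simplex algorithm that traces the entire piecewise-linear solution path exactly, whereas this sketch merely approximates it at grid points.

```python
# Minimal sketch (not the authors' implementation): l1-penalized quantile
# regression written as a linear program and re-solved on a penalty grid.
import numpy as np
from scipy.optimize import linprog

def l1_quantile_lp(X, y, tau, lam):
    """Solve min_{b0,b} sum_i rho_tau(y_i - b0 - x_i'b) + lam * ||b||_1 as an LP.

    Variables (all nonnegative): b0 = a - c, b = p - q, residual = u - v.
    Objective: tau * sum(u) + (1 - tau) * sum(v) + lam * sum(p + q).
    Equality constraints: a - c + X (p - q) + u - v = y.
    """
    n, d = X.shape
    ones = np.ones((n, 1))
    # Column order of the LP variables: [a, c, p (d), q (d), u (n), v (n)]
    A_eq = np.hstack([ones, -ones, X, -X, np.eye(n), -np.eye(n)])
    cost = np.concatenate([[0.0, 0.0],
                           lam * np.ones(2 * d),
                           tau * np.ones(n),
                           (1.0 - tau) * np.ones(n)])
    res = linprog(cost, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")
    a, c = res.x[0], res.x[1]
    p, q = res.x[2:2 + d], res.x[2 + d:2 + 2 * d]
    return a - c, p - q  # intercept, coefficient vector

# Usage: on synthetic data, watch features enter the model as lam shrinks
# (a coarse stand-in for the exact regularization path in the paper).
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
y = X[:, 0] - 2.0 * X[:, 2] + 0.3 * rng.standard_normal(100)
for lam in [50.0, 10.0, 1.0, 0.1]:
    b0, beta = l1_quantile_lp(X, y, tau=0.5, lam=lam)
    print(f"lam={lam:6.1f}  nonzero features: {np.flatnonzero(np.abs(beta) > 1e-8)}")
```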