Structured variable selection in support vector machines

Cited by: 8
Authors
Wu, Seongho [1]
Zou, Hui [1]
Yuan, Ming [2]
Affiliations
[1] Univ Minnesota, Sch Stat, Minneapolis, MN 55455 USA
[2] Georgia Inst Technol, Sch Ind & Syst Engn, Atlanta, GA 30332 USA
Source
ELECTRONIC JOURNAL OF STATISTICS | 2008, Vol. 2
Funding
National Science Foundation (USA);
Keywords
Classification; Heredity; Nonparametric estimation; Support vector machine; Variable selection;
DOI
10.1214/07-EJS125
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Subject classification codes
020208; 070103; 0714;
Abstract
When applying the support vector machine (SVM) to high-dimensional classification problems, we often impose a sparse structure on the SVM to eliminate the influence of irrelevant predictors. The lasso and other variable selection techniques have been successfully used in the SVM to perform automatic variable selection. In some problems there is a natural hierarchical structure among the variables, so to obtain an interpretable SVM classifier it is important to respect the heredity principle when enforcing sparsity in the SVM. Many variable selection methods, however, do not respect the heredity principle. In this paper we enforce both sparsity and the heredity principle in the SVM by using the so-called structured variable selection (SVS) framework originally proposed in [20]. We minimize the empirical hinge loss under a set of linear inequality constraints and a lasso-type penalty. The solution always obeys the desired heredity principle and enjoys sparsity. The new SVM classifier can be fitted efficiently because the optimization problem is a linear program. Another contribution of this work is a nonparametric extension of the SVS framework, with which we propose nonparametric heredity SVMs. Simulated and real data are used to illustrate the merits of the proposed method.
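To make the linear-programming formulation mentioned in the abstract concrete, the sketch below fits a linear SVM by minimizing the empirical hinge loss plus a lasso-type penalty via scipy.optimize.linprog. This is a minimal illustration only: the function name l1_hinge_svm_lp and the tuning parameter lam are hypothetical, and the heredity (SVS) constraints of the paper are omitted; they would enter as additional linear inequalities on the coefficients.

```python
import numpy as np
from scipy.optimize import linprog

def l1_hinge_svm_lp(X, y, lam=0.1):
    """Illustrative sketch: L1-penalized hinge-loss linear SVM as a linear program.

    X : (n, p) design matrix; y : (n,) labels in {-1, +1}.
    Heredity constraints from the SVS framework are NOT included here.
    Returns (beta, intercept)."""
    n, p = X.shape
    # Decision variables: z = [u (p), v (p), b (1), xi (n)], with beta = u - v,
    # u, v >= 0, so that the lasso penalty sum|beta_j| <= sum(u_j + v_j) is linear.
    c = np.concatenate([lam * np.ones(2 * p), [0.0], np.ones(n) / n])
    # Hinge constraints: xi_i >= 1 - y_i (b + x_i . (u - v)), rewritten as
    #   -y_i x_i . u + y_i x_i . v - y_i b - xi_i <= -1
    Yx = y[:, None] * X
    A_ub = np.hstack([-Yx, Yx, -y[:, None], -np.eye(n)])
    b_ub = -np.ones(n)
    bounds = [(0, None)] * (2 * p) + [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    z = res.x
    beta = z[:p] - z[p:2 * p]
    intercept = z[2 * p]
    return beta, intercept

# Toy usage on synthetic data with two informative and ten noise predictors.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 12))
y = np.sign(X[:, 0] - 2 * X[:, 1] + 0.3 * rng.standard_normal(200))
beta, b0 = l1_hinge_svm_lp(X, y, lam=0.05)
print(np.round(beta, 3), round(b0, 3))
```

Because both the hinge loss and the L1 penalty are piecewise linear, the whole problem is an LP; in the paper's setting, the heredity principle is preserved by adding further linear inequality constraints to the same program.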
Pages: 103-117
Number of pages: 15
Related papers
50 records in total
  • [21] AUC Maximizing Support Vector Machines with Feature Selection
    Tian, Yingjie
    Shi, Yong
    Chen, Xiaojun
    Chen, Wenjing
    PROCEEDINGS OF THE INTERNATIONAL CONFERENCE ON COMPUTATIONAL SCIENCE (ICCS), 2011, 4 : 1691 - 1698
  • [22] Feature selection for support vector machines with RBF kernel
    Liu, Quanzhong
    Chen, Chihau
    Zhang, Yang
    Hu, Zhengguo
    ARTIFICIAL INTELLIGENCE REVIEW, 2011, 36 (02) : 99 - 115
  • [23] Clustering model selection for reduced support vector machines
    Jen, LR
    Lee, YJ
    INTELLIGENT DATA ENGINEERING AND AUTOMATED LEARNING IDEAL 2004, PROCEEDINGS, 2004, 3177 : 714 - 719
  • [24] Optimal kernel selection in twin support vector machines
    Khemchandani, Reshma
    Jayadeva
    Chandra, Suresh
    OPTIMIZATION LETTERS, 2009, 3 (01) : 77 - 88
  • [25] Research on parameter selection method for support vector machines
    Sun, Ling
    Bao, Jian
    Chen, Yangyang
    Yang, Mingming
    APPLIED INTELLIGENCE, 2018, 48 (02) : 331 - 342
  • [26] Variable selection for the linear support vector machine
    Zhu, Ji
    Zou, Hui
    TRENDS IN NEURAL COMPUTATION, 2007, 35 : 35 - +
  • [28] Support vector machines with adaptive Lq penalty
    Liu, Yufeng
    Zhang, Hao Helen
    Park, Cheolwoo
    Ahn, Jeongyoun
    COMPUTATIONAL STATISTICS & DATA ANALYSIS, 2007, 51 (12) : 6380 - 6394
  • [29] Least Squares Support Vector Machines With Variable Selection and Hyperparameter Optimization for Complex Structures Reliability Assessment
    Dong, Xiaowei
    Zhang, Hao
    Li, Zhenao
    Zhu, Chunyan
    Yi, Shujuan
    Chen, Changhai
    QUALITY AND RELIABILITY ENGINEERING INTERNATIONAL, 2025,
  • [30] Field Support Vector Machines
    Huang, Kaizhu
    Jiang, Haochuan
    Zhang, Xu-Yao
    IEEE TRANSACTIONS ON EMERGING TOPICS IN COMPUTATIONAL INTELLIGENCE, 2017, 1 (06): : 454 - 463