Sparse Support Vector Machine with Lp Penalty for Feature Selection

Cited by: 0
Authors
Lan Yao
Feng Zeng
Dong-Hui Li
Zhi-Gang Chen
Affiliations
[1] Hunan University,College of Mathematics and Econometrics
[2] Central South University,School of Software
[3] South China Normal University,School of Mathematical Sciences
Source
Journal of Computer Science and Technology | 2017, Vol. 32
Keywords
machine learning; feature selection; support vector machine; Lp-regularization
DOI
Not available
Abstract
We study strategies for feature selection with the sparse support vector machine (SVM). Recently, the so-called Lp-SVM (0 < p < 1) has attracted much attention because it can encourage better sparsity than the widely used L1-SVM. However, Lp-SVM is a non-convex and non-Lipschitz optimization problem, and solving it numerically is challenging. In this paper, we reformulate the Lp-SVM into an optimization model with a linear objective function and smooth constraints (LOSC-SVM), so that it can be solved by numerical methods for smooth constrained optimization. Our numerical experiments on artificial datasets show that LOSC-SVM (0 < p < 1) can improve performance in both feature selection and classification by choosing a suitable parameter p. We also apply it to some real-life datasets, and the experimental results show that it is superior to L1-SVM.
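The paper's LOSC-SVM solves the non-convex Lp case via a smooth constrained reformulation, which is beyond a few lines of code. The underlying mechanism it improves on, though, is how a sparsity-inducing penalty on the SVM weight vector drives the weights of irrelevant features to exactly zero. The sketch below illustrates that mechanism for the convex L1 baseline mentioned in the abstract, using proximal subgradient descent on the hinge loss; the toy data, step size, and regularization strength are all illustrative assumptions, not values from the paper.

```python
# Minimal sketch (NOT the paper's LOSC-SVM): an L1-regularized linear SVM
# trained by proximal subgradient descent, showing how a sparsity-inducing
# penalty zeroes out the weight of an uninformative feature.
# All data and hyperparameters below are illustrative assumptions.

def soft_threshold(v, t):
    """Proximal operator of the L1 norm: shrink v toward zero by t."""
    if v > t:
        return v - t
    if v < -t:
        return v + t
    return 0.0

def train_l1_svm(X, y, lam=0.05, eta=0.1, iters=200):
    """Minimize  mean hinge loss + lam * ||w||_1  (bias omitted for brevity)."""
    n, d = len(X), len(X[0])
    w = [0.0] * d
    for _ in range(iters):
        # Subgradient of the mean hinge loss over margin-violating points.
        grad = [0.0] * d
        for xi, yi in zip(X, y):
            margin = yi * sum(wj * xj for wj, xj in zip(w, xi))
            if margin < 1.0:
                for j in range(d):
                    grad[j] -= yi * xi[j] / n
        # Gradient step, then the L1 proximal (soft-thresholding) step.
        w = [soft_threshold(wj - eta * gj, eta * lam)
             for wj, gj in zip(w, grad)]
    return w

# Feature 0 carries the label; feature 1 is uninformative noise.
X = [[2.0, 0.1], [1.0, -0.1], [-1.0, 0.05], [-2.0, -0.05]]
y = [1, 1, -1, -1]
w = train_l1_svm(X, y)
# w[0] is large while w[1] is shrunk to exactly 0: the noise feature
# is deselected, which is the effect Lp penalties (0 < p < 1) strengthen.
```

Replacing the soft-thresholding step with the (non-convex) proximal map of |w|^p is one route to the stronger sparsity the paper studies, but that problem is non-Lipschitz, which is exactly why the authors instead reformulate it with smooth constraints.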
Pages: 68-77
Number of pages: 9