Large Margin Feature Selection for Support Vector Machine

Cited by: 1
Authors
Pan, Wei [1 ]
Ma, Peijun [1 ]
Su, Xiaohong [1 ]
Affiliations
[1] Harbin Inst Technol, Sch Comp Sci & Technol, Harbin 150006, Heilongjiang, Peoples R China
Source
MECHANICAL ENGINEERING, MATERIALS SCIENCE AND CIVIL ENGINEERING | 2013 / Vol. 274
Keywords
Margin; Feature selection; Support Vector Machine;
DOI
10.4028/www.scientific.net/AMM.274.161
Chinese Library Classification (CLC)
T [Industrial Technology];
Discipline Classification Code
08;
Abstract
Feature selection is a preprocessing step in pattern analysis and machine learning. In this paper, we design an algorithm for selecting a feature subset. We present an L1-norm regularization technique that yields sparse feature weights. A margin loss is introduced to evaluate features, and gradient descent is employed to search for the solution that maximizes the margin. The proposed technique is tested on UCI data sets. Compared with four margin-based loss functions for SVM, the proposed technique is effective and efficient.
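The abstract describes maximizing a margin-based objective with L1-norm regularization via gradient descent. The following is a minimal sketch of that idea, assuming the margin loss is the standard SVM hinge loss and that sparsity comes from an L1 penalty on the feature weight vector; the function name, hyperparameters, and synthetic data are illustrative and not taken from the paper.

import numpy as np

def l1_svm_feature_weights(X, y, lam=0.05, lr=0.01, n_iter=1000):
    # Subgradient descent on the L1-regularized hinge loss
    #   (1/n) * sum_i max(0, 1 - y_i * (w . x_i + b)) + lam * ||w||_1,
    # with labels y in {-1, +1}. Returns the weight vector w and bias b.
    n, p = X.shape
    w, b = np.zeros(p), 0.0
    for _ in range(n_iter):
        margins = y * (X @ w + b)
        active = margins < 1                          # samples inside or violating the margin
        grad_w = -(y[active, None] * X[active]).sum(axis=0) / n + lam * np.sign(w)
        grad_b = -y[active].sum() / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 200
    y = rng.choice([-1, 1], size=n).astype(float)
    # Two informative features followed by four pure-noise features.
    X = np.column_stack([y + 0.5 * rng.standard_normal(n),
                         -0.5 * y + 0.5 * rng.standard_normal(n),
                         rng.standard_normal((n, 4))])
    w, _ = l1_svm_feature_weights(X, y)
    ranking = np.argsort(-np.abs(w))                  # features ranked by |weight|
    print("feature weights:", np.round(w, 3))
    print("ranking (most to least relevant):", ranking)

Under these assumptions, feature selection amounts to keeping the features whose learned weights remain clearly non-zero (or the top-ranked ones); the L1 term is what drives the weights of uninformative features toward zero.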
Pages: 161-164
Number of pages: 4