Support vector machines ensemble with optimizing weights by genetic algorithm

Cited by: 0
Authors
He, Ling-Min [1 ,2 ]
Yang, Xiao-Bing [1 ]
Kong, Fan-Sheng [2 ]
Affiliations
[1] China Jiliang Univ, Coll Informat Engn, Hangzhou 310018, Peoples R China
[2] Zhejiang Univ, Artificial Intelligence Inst, Hangzhou 310027, Zhejiang, Peoples R China
Source
PROCEEDINGS OF 2006 INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND CYBERNETICS, VOLS 1-7 | 2006年
Keywords
support vector machines; genetic algorithm; ensemble; classification;
DOI
Not available
CLC Classification Number
TP [Automation Technology, Computer Technology];
Discipline Classification Number
0812;
Abstract
Support Vector Machines (SVM) is a classification technique based on the structural risk minimization principle, capable of handling complex data with high accuracy. An ensemble of classifiers often performs better than any of its component classifiers. In this paper, four SVM ensemble methods, namely bagging, boosting, the multiple SVM decision model (MSDM), and the heterogeneous SVM decision model (HSDM), are compared on four data sets. For bagging and boosting, a genetic algorithm is used to optimize the combining weights of the component SVMs. Experimental results show that SVM ensembles whose weights are optimized by the genetic algorithm achieve higher accuracy.
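The core idea of the abstract, training several component SVMs and searching for their combining weights with a genetic algorithm, can be illustrated with a minimal Python sketch. This is not the authors' exact method: the benchmark data set, the weighted-majority-vote combiner, and the GA operators (truncation selection, arithmetic crossover, Gaussian mutation) and hyperparameters are all illustrative assumptions.

    # Sketch: bagged SVMs whose combining weights are tuned by a simple GA.
    # Dataset, GA operators and hyperparameters are illustrative assumptions.
    import numpy as np
    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X, y = load_breast_cancer(return_X_y=True)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    X_fit, X_val, y_fit, y_val = train_test_split(X_tr, y_tr, test_size=0.3, random_state=0)

    # Bagging: train component SVMs on bootstrap samples of the training data.
    n_svm = 10
    svms = []
    for _ in range(n_svm):
        idx = rng.integers(0, len(X_fit), len(X_fit))        # bootstrap sample
        svms.append(SVC(kernel="rbf", gamma="scale").fit(X_fit[idx], y_fit[idx]))

    def weighted_vote(weights, X):
        """Weighted majority vote of the component SVMs (labels are 0/1)."""
        votes = np.array([m.predict(X) for m in svms])        # (n_svm, n_samples)
        score = weights @ (2 * votes - 1)                     # signed, weighted sum
        return (score > 0).astype(int)

    def fitness(weights):
        # Fitness = accuracy of the weighted vote on a held-out validation set.
        return np.mean(weighted_vote(weights, X_val) == y_val)

    # Genetic algorithm over the combining weights.
    pop_size, n_gen, mut_sigma = 30, 40, 0.1
    pop = rng.random((pop_size, n_svm))                       # weights in [0, 1]
    for _ in range(n_gen):
        fit = np.array([fitness(w) for w in pop])
        order = np.argsort(fit)[::-1]
        parents = pop[order[: pop_size // 2]]                 # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            alpha = rng.random()
            child = alpha * a + (1 - alpha) * b               # arithmetic crossover
            child += rng.normal(0, mut_sigma, n_svm)          # Gaussian mutation
            children.append(np.clip(child, 0, 1))
        pop = np.vstack([parents, children])

    best = max(pop, key=fitness)
    print("test accuracy:", np.mean(weighted_vote(best, X_te) == y_te))

The same fitness-driven search would apply to boosting-style ensembles by reusing the boosted component SVMs in place of the bagged ones; only the way the components are trained changes, not the weight optimization.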
Pages: 3503+
Number of pages: 2