A new boosting design of Support Vector Machine classifiers

Cited by: 15
Authors
Mayhua-Lopez, Efrain [1]
Gomez-Verdejo, Vanessa [2]
Figueiras-Vidal, Anibal R. [2]
Affiliations
[1] Univ Catolica San Pablo, Arequipa, Peru
[2] Univ Carlos III Madrid, Dept Signal Theory & Commun, Madrid 28911, Spain
Keywords
Real AdaBoost; Subsampling; Support Vector Machines; Linear programming; Ensemble classifiers; Neural networks; Classification; AdaBoost; Algorithms; Ensembles; Model
DOI
10.1016/j.inffus.2014.10.005
Chinese Library Classification
TP18 [Artificial intelligence theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Boosting algorithms pay attention to the particular structure of the training data when learning, by iteratively emphasizing the importance of training samples according to how difficult they are to classify correctly. If common kernel Support Vector Machines (SVMs) are used as base learners to construct a Real AdaBoost ensemble, the resulting ensemble can easily be compacted into a monolithic architecture by simply combining the weights that correspond to the same kernels when they appear in different learners, so this potential advantage comes without any increase in operational computational effort. In this way, the performance advantage that boosting provides can be obtained for monolithic SVMs, i.e., without paying a classification-time computational cost for needing many learners. However, SVMs are both stable and strong, and using them for boosting requires destabilizing and weakening them; previous attempts in this direction have shown only moderate success. In this paper, we propose combining a new, appropriately designed subsampling process with an SVM algorithm that permits sparsity control, in order to overcome the difficulties of boosting SVMs and obtain designs with improved performance. Experimental results support the effectiveness of the approach, not only in performance but also in the compactness of the resulting classifiers, and show that combining both design ideas is necessary to arrive at these advantageous designs. (C) 2014 Elsevier B.V. All rights reserved.
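The abstract's compaction argument can be made concrete with a small sketch. The Python code below (assuming NumPy and scikit-learn; all names and parameter values are illustrative) shows the two ideas the abstract combines: emphasis-driven subsampling of the training set for each weak learner, and folding the whole ensemble into one monolithic kernel machine by summing the coefficients that different learners attach to the same kernels. For simplicity it uses a classical discrete-AdaBoost weight update and an off-the-shelf soft-margin SVC, not the authors' Real AdaBoost emphasis or their sparsity-controlled SVM.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=400, n_features=10, random_state=0)
y = 2 * y - 1  # labels in {-1, +1}

n_rounds = 10
subsample = 0.5   # fraction of the data drawn per round (illustrative value)
gamma = 0.1       # shared RBF width, so all learners use identical kernels

w = np.full(len(X), 1.0 / len(X))  # boosting emphasis over the training set
alpha = np.zeros(len(X))           # per-sample coefficients of the compacted machine
bias = 0.0
F_ens = np.zeros(len(X))           # running ensemble output, kept for the check below

for t in range(n_rounds):
    # Emphasis-driven subsampling: harder (heavier) samples are drawn more often.
    idx = rng.choice(len(X), size=int(subsample * len(X)), replace=False, p=w)
    svm = SVC(kernel="rbf", gamma=gamma, C=1.0).fit(X[idx], y[idx])

    f = svm.decision_function(X)                   # real-valued learner output
    err = np.clip(w @ (np.sign(f) != y), 1e-6, 1 - 1e-6)
    c = 0.5 * np.log((1.0 - err) / err)            # AdaBoost combination weight

    # Compaction: the learner is f_t(x) = sum_j dual_coef_j K(x_j, x) + b_t,
    # so c_t * f_t folds into the monolithic machine by summing the
    # coefficients attached to the same training points (hence the shared kernel).
    alpha[idx[svm.support_]] += c * svm.dual_coef_[0]
    bias += c * svm.intercept_[0]
    F_ens += c * f

    # Re-emphasize misclassified samples and renormalize.
    w *= np.exp(-c * y * np.sign(f))
    w /= w.sum()

# The single compacted machine reproduces the whole ensemble exactly.
sv = np.flatnonzero(alpha)
F_mono = rbf_kernel(X, X[sv], gamma=gamma) @ alpha[sv] + bias
assert np.allclose(F_ens, F_mono)
print(f"compacted machine uses {len(sv)} kernels for {n_rounds} learners")
```

Because every learner reuses kernels centered on training points, the compacted machine is never larger than a single SVM trained on the full set, which is why boosting gains can come without extra operating cost; in the paper, the subsampling design and the sparsity control of the base SVM govern how many of those kernels survive.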
Pages: 63-71 (9 pages)