An empirical assessment on the robustness of support vector regression with different kernels

Cited by: 0
Authors
Liu, JX [1 ]
Li, J [1 ]
Tan, YJ [1 ]
Affiliation
[1] Natl Univ Def Technol, Dept Informat Syst & Management, Changsha 410073, Peoples R China
Source
Proceedings of 2005 International Conference on Machine Learning and Cybernetics, Vols 1-9 | 2005
Keywords
robustness; kernel function; composition of kernels; support vector regression;
DOI
Not available
CLC number
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The choice of kernel is important for support vector regression (SVR). In this paper, the robustness of SVR with different kernels is analyzed empirically. Two typical kernels, the polynomial kernel and the Radial Basis Function (RBF) kernel, as well as their hybrids, are used. Two simple composition rules are applied to produce the hybrid kernels. The experimental results show that SVR with kernels of the same type but different scales performs differently. SVR using a polynomial kernel of lower degree is more robust than SVR using an RBF kernel with a narrower width parameter. Hybrid kernels combining different types or scales can improve robustness to some extent. Furthermore, two benchmark datasets are used to verify the results.
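As an illustration of the setup described in the abstract, the sketch below trains SVR models with a polynomial kernel, an RBF kernel, and hybrid kernels built from two common composition rules (a weighted sum and an elementwise product). These rules, the parameter values, and the toy data are assumptions for illustration only, not the authors' experimental configuration; the sketch uses scikit-learn's SVR with callable kernels.

```python
# A minimal sketch (not the paper's code) of SVR with polynomial, RBF, and
# hybrid kernels. The hybrids use two standard kernel-composition rules:
# a weighted sum and an elementwise product, both of which yield valid kernels.
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics.pairwise import polynomial_kernel, rbf_kernel

def hybrid_sum(X, Y, degree=2, gamma=0.5, alpha=0.5):
    # Weighted sum of a polynomial and an RBF kernel (illustrative parameters).
    return alpha * polynomial_kernel(X, Y, degree=degree) \
        + (1 - alpha) * rbf_kernel(X, Y, gamma=gamma)

def hybrid_product(X, Y, degree=2, gamma=0.5):
    # Elementwise product of a polynomial and an RBF kernel.
    return polynomial_kernel(X, Y, degree=degree) * rbf_kernel(X, Y, gamma=gamma)

# Toy regression data; a robustness study would add noise or outliers here.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sinc(X).ravel() + 0.1 * rng.standard_normal(200)

models = {
    "poly (degree=2)": SVR(kernel="poly", degree=2, C=10.0),
    "rbf (gamma=0.5)": SVR(kernel="rbf", gamma=0.5, C=10.0),
    "hybrid sum":      SVR(kernel=hybrid_sum, C=10.0),
    "hybrid product":  SVR(kernel=hybrid_product, C=10.0),
}
for name, model in models.items():
    model.fit(X, y)
    print(f"{name:18s} training R^2 = {model.score(X, y):.3f}")
```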
Pages: 4289-4294
Number of pages: 6