A comparative study on large scale kernelized support vector machines

Cited by: 0
Authors
Daniel Horn
Aydın Demircioğlu
Bernd Bischl
Tobias Glasmachers
Claus Weihs
Institutions
[1] Technische Universität Dortmund, Fakultät Statistik
[2] Ruhr-Universität Bochum, Department of Statistics
[3] LMU München
Source
Advances in Data Analysis and Classification | 2018 / Volume 12
Keywords
Support vector machine; Multi-objective optimization; Supervised learning; Machine learning; Large scale; Nonlinear SVM; Parameter tuning; 62-07 Data analysis;
DOI
Not available
Abstract
Kernelized support vector machines (SVMs) are among the most widely used classification methods. However, in contrast to linear SVMs, the computation time required to train such a machine becomes a bottleneck when facing large data sets. To mitigate this shortcoming of kernel SVMs, many approximate training algorithms have been developed. While most of these methods claim to be much faster than the state-of-the-art solver LIBSVM, a thorough comparative study is missing. We aim to fill this gap. We choose several well-known approximate SVM solvers and compare their performance on a number of large benchmark data sets. Our focus is to analyze the trade-off between prediction error and runtime for different learning and accuracy parameter settings. This includes simple subsampling of the data, the poor man's approach to handling large-scale problems. We employ model-based multi-objective optimization, which allows us to tune the parameters of the learning machine and the solver over the full range of accuracy/runtime trade-offs. We analyze differences between solvers by studying and comparing the Pareto fronts formed by the two objectives, classification error and training time. Unsurprisingly, given more runtime, most solvers are able to find more accurate solutions, i.e., achieve a higher prediction accuracy. It turns out that LIBSVM with subsampling of the data is a strong baseline. Some solvers systematically outperform others, which allows us to give concrete recommendations of when to use which solver.
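For illustration, the Pareto-front comparison described in the abstract can be sketched in a few lines: given a set of (classification error, training time) measurements for one solver, the Pareto-optimal configurations are those that no other configuration beats in both objectives at once. A minimal sketch (the function name and the sample points are illustrative, not taken from the paper):

```python
def pareto_front(points):
    """Return the Pareto-optimal subset of (error, runtime) pairs,
    where lower is better in both objectives.

    Sorting by error (then runtime) and sweeping once suffices:
    a point enters the front only if it improves on the best
    runtime seen so far among configurations with lower error.
    """
    front = []
    for p in sorted(points):  # ascending by error, ties broken by runtime
        if not front or p[1] < front[-1][1]:
            front.append(p)
    return front


# Hypothetical measurements for one solver:
# (classification error, training time in seconds)
measurements = [
    (0.10, 5.0),
    (0.05, 20.0),
    (0.08, 8.0),
    (0.05, 30.0),  # dominated by (0.05, 20.0)
    (0.12, 3.0),
]
print(pareto_front(measurements))
# → [(0.05, 20.0), (0.08, 8.0), (0.1, 5.0), (0.12, 3.0)]
```

Overlaying such fronts for different solvers, as the study does, makes it immediately visible which solver offers the better error for any given training-time budget.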
Pages: 867–883 (16 pages)