Generalized SMO Algorithm for SVM-Based Multitask Learning

Cited by: 56
Authors
Cai, Feng [1 ]
Cherkassky, Vladimir [1 ]
Affiliations
[1] Univ Minnesota, Dept Elect & Comp Engn, Minneapolis, MN 55455 USA
Funding
US National Science Foundation (NSF)
Keywords
Classification; learning with structured data; multitask learning; quadratic optimization; sequential minimal optimization; support vector machine (SVM)
DOI
10.1109/TNNLS.2012.2187307
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Exploiting additional information to improve traditional inductive learning is an active research area in machine learning. In many supervised-learning applications, training data can be naturally separated into several groups, and incorporating this group information into learning may improve generalization. Recently, Vapnik proposed a general approach to formalizing such problems, known as "learning with structured data," along with its support vector machine (SVM)-based optimization formulation, called SVM+. Liang and Cherkassky showed the connection between SVM+ and multitask learning (MTL) approaches in machine learning and proposed an SVM-based formulation for MTL classification called SVM+ MTL. Training the SVM+ MTL classifier requires solving a large quadratic programming problem that scales as O(n^3) with sample size n, so computationally efficient algorithms for implementing SVM+ MTL are needed. This brief generalizes Platt's sequential minimal optimization (SMO) algorithm to the SVM+ MTL setting. Empirical results show that, for typical SVM+ MTL problems, the proposed generalized SMO achieves over a 100-fold speed-up compared with general-purpose optimization routines.
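For context on the approach the abstract describes, the following is a minimal sketch of Platt's classic SMO for the standard soft-margin SVM dual, i.e., the algorithm this brief generalizes to the SVM+ MTL setting; it is not the paper's generalized algorithm. The function and parameter names (simple_smo, linear_kernel, max_passes) and the random choice of the second working-set index are illustrative assumptions, not from the paper.

```python
# Sketch of Platt-style SMO for the standard soft-margin SVM dual:
# repeatedly pick a pair (i, j) of dual variables that violates the
# KKT conditions and solve the two-variable QP subproblem analytically.
import numpy as np

def linear_kernel(x, z):
    return float(x @ z)

def simple_smo(X, y, C=1.0, tol=1e-3, max_passes=5, kernel=linear_kernel):
    """Simplified SMO for labels y in {-1, +1}. Returns (alpha, b)."""
    n = X.shape[0]
    alpha = np.zeros(n)
    b = 0.0
    K = np.array([[kernel(X[i], X[j]) for j in range(n)] for i in range(n)])

    def f(i):  # decision function at training point i
        return float((alpha * y) @ K[:, i]) + b

    passes = 0
    while passes < max_passes:
        changed = 0
        for i in range(n):
            E_i = f(i) - y[i]
            # KKT violation check for alpha_i.
            if (y[i] * E_i < -tol and alpha[i] < C) or \
               (y[i] * E_i > tol and alpha[i] > 0):
                j = np.random.choice([k for k in range(n) if k != i])
                E_j = f(j) - y[j]
                a_i_old, a_j_old = alpha[i], alpha[j]
                # Box constraints restrict the pair to a segment of the
                # line y_i*alpha_i + y_j*alpha_j = const inside [0, C]^2.
                if y[i] != y[j]:
                    L = max(0.0, a_j_old - a_i_old)
                    H = min(C, C + a_j_old - a_i_old)
                else:
                    L = max(0.0, a_i_old + a_j_old - C)
                    H = min(C, a_i_old + a_j_old)
                if L == H:
                    continue
                # Second derivative of the objective along the line.
                eta = 2 * K[i, j] - K[i, i] - K[j, j]
                if eta >= 0:
                    continue
                alpha[j] = np.clip(a_j_old - y[j] * (E_i - E_j) / eta, L, H)
                if abs(alpha[j] - a_j_old) < 1e-5:
                    continue
                alpha[i] = a_i_old + y[i] * y[j] * (a_j_old - alpha[j])
                # Update the threshold b so KKT holds for the updated pair.
                b1 = b - E_i - y[i] * (alpha[i] - a_i_old) * K[i, i] \
                     - y[j] * (alpha[j] - a_j_old) * K[i, j]
                b2 = b - E_j - y[i] * (alpha[i] - a_i_old) * K[i, j] \
                     - y[j] * (alpha[j] - a_j_old) * K[j, j]
                if 0 < alpha[i] < C:
                    b = b1
                elif 0 < alpha[j] < C:
                    b = b2
                else:
                    b = (b1 + b2) / 2
                changed += 1
        passes = passes + 1 if changed == 0 else 0
    return alpha, b
```

The O(n^3) cost quoted in the abstract refers to solving the full dual QP at once with a general-purpose solver; SMO avoids this by optimizing only two dual variables per step in closed form, which is what the brief extends to the coupled dual problem arising in SVM+ MTL.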
Pages: 997-1003
Page count: 7