Multiple Kernel Learning with Gaussianity Measures

Cited by: 5
Authors
Hino, Hideitsu [1 ]
Reyhani, Nima [2 ]
Murata, Noboru [1 ]
Affiliations
[1] Waseda Univ, Sch Sci & Engn, Shinjuku Ku, Tokyo 1698555, Japan
[2] Aalto Univ, Sch Sci, FI-00076 Aalto, Finland
Keywords
empirical characteristic function
DOI
10.1162/NECO_a_00299
CLC number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Kernel methods are known to be effective for nonlinear multivariate analysis. One of the main issues in their practical use is the selection of an appropriate kernel, and many studies have addressed kernel selection and kernel learning. Multiple kernel learning (MKL) is one of the promising kernel optimization approaches. Kernel methods are applied to various classifiers, including Fisher discriminant analysis (FDA). FDA gives the Bayes-optimal classification axis if the data distribution of each class in the feature space is gaussian with a shared covariance structure. Based on this fact, an MKL framework built on the notion of gaussianity is proposed. As a concrete implementation, an empirical characteristic function is adopted to measure gaussianity in the feature space associated with a convex combination of kernel functions, and two MKL algorithms are derived. Experimental results on several data sets show that the proposed kernel learning followed by FDA offers strong classification power.
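The core idea in the abstract can be illustrated with a small sketch: the empirical characteristic function (ECF) of a one-dimensional sample is compared against the characteristic function of a gaussian with matched mean and variance, and the squared distance between the two serves as a gaussianity measure. This is a hypothetical illustration of the general principle only, not the authors' algorithm; the evaluation grid `ts` and the squared-distance criterion are assumptions made for this sketch.

```python
import numpy as np

def ecf(x, ts):
    # Empirical characteristic function of sample x,
    # evaluated at the grid points ts: mean of exp(i*t*x_j).
    return np.exp(1j * np.outer(ts, x)).mean(axis=1)

def gaussianity_deviation(x, ts=None):
    # Squared distance between the ECF of x and the characteristic
    # function of a gaussian with matched mean and variance.
    # A smaller value indicates a more gaussian-looking sample.
    if ts is None:
        ts = np.linspace(-3.0, 3.0, 61)  # assumed evaluation grid
    mu, var = x.mean(), x.var()
    gauss_cf = np.exp(1j * ts * mu - 0.5 * var * ts**2)
    return float(np.mean(np.abs(ecf(x, ts) - gauss_cf) ** 2))

rng = np.random.default_rng(0)
d_gauss = gaussianity_deviation(rng.normal(size=2000))       # near-gaussian sample
d_exp = gaussianity_deviation(rng.exponential(size=2000))    # skewed sample
# the gaussian sample deviates less from its matched gaussian CF
```

In the paper's setting this kind of measure would be evaluated on data mapped through a convex combination of kernels, with the combination weights optimized to make each class look gaussian; the sketch above shows only the measurement step on raw one-dimensional samples.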
Pages: 1853-1881
Page count: 29