Deep multilayer multiple kernel learning

Cited: 29
Authors
Rebai, Ilyes [1 ]
BenAyed, Yassine [1 ]
Mahdi, Walid [2 ]
Affiliations
[1] Univ Sfax, MIRACL Multimedia InfoRmat Syst & Adv Comp Lab, Sfax, Tunisia
[2] Taif Univ, Coll Comp & Informat Technol, At Taif, Saudi Arabia
Keywords
Deep learning; Support vector machine; Multilayer multiple kernel learning; Optimization methods; Gradient ascent;
DOI
10.1007/s00521-015-2066-x
CLC classification number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104; 0812; 0835; 1405
Abstract
The multiple kernel learning (MKL) approach has been proposed for kernel methods and has shown high performance on several real-world applications. It consists of learning the optimal kernel from a single layer of multiple predefined kernels. Unfortunately, this approach is not expressive enough to solve relatively complex problems. With the emergence and success of deep learning, multilayer multiple kernel learning (MLMKL) methods were inspired by the idea of deep architectures and were introduced to improve on conventional MKL methods. Such architectures learn deep kernel machines by exploring combinations of multiple kernels in a multilayer structure. However, existing MLMKL methods often have trouble optimizing the network for two or more layers, and they do not always outperform the simplest way of combining multiple kernels (i.e., one-layer MKL). To improve the effectiveness of MKL approaches, we introduce in this paper a novel backpropagation MLMKL framework. Specifically, we optimize the network with an adaptive backpropagation algorithm, using the gradient ascent method rather than the dual objective function or an estimate of the leave-one-out error. We evaluate the proposed method through an extensive set of experiments on a variety of benchmark data sets and successfully optimize the system over many layers. Empirical results show that our algorithm achieves high performance compared to the traditional MKL approach and to existing MLMKL methods.
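The abstract describes stacking multiple base kernels in layers and tuning the combination weights by gradient ascent. The toy sketch below is only an illustration of that idea, not the authors' algorithm: it forms a two-layer kernel (a weighted sum of RBF Gram matrices passed through an exponential second layer) and adjusts the layer-1 weights by numerical gradient ascent on kernel-target alignment. The data, the alignment objective, and all parameter values are assumptions chosen for demonstration.

```python
import numpy as np

def rbf(X, gamma):
    # Gram matrix of an RBF base kernel k(x, z) = exp(-gamma * ||x - z||^2)
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-gamma * d2)

def alignment(K, Y):
    # Kernel-target alignment <K, yy^T>_F / (||K||_F * ||yy^T||_F),
    # used here as a simple stand-in objective (an assumption).
    T = np.outer(Y, Y)
    return np.sum(K * T) / (np.linalg.norm(K) * np.linalg.norm(T))

# Toy data: two Gaussian blobs with labels -1 / +1
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 0.5, (20, 2)), rng.normal(1, 0.5, (20, 2))])
Y = np.array([-1.0] * 20 + [1.0] * 20)

gammas = [0.1, 1.0, 10.0]
base = [rbf(X, g) for g in gammas]       # layer-1 base kernels
mu = np.ones(len(base)) / len(base)      # layer-1 combination weights
w = 1.0                                  # fixed layer-2 scale (assumption)

eta, eps = 0.05, 1e-4
for _ in range(200):
    K1 = sum(m * K for m, K in zip(mu, base))
    K2 = np.exp(w * K1)                  # layer-2 kernel applied on layer 1
    a0 = alignment(K2, Y)
    # Numerical gradient ascent on the alignment (finite differences);
    # the paper derives analytic backpropagation gradients instead.
    grad = np.empty_like(mu)
    for t in range(len(mu)):
        mu_p = mu.copy()
        mu_p[t] += eps
        K2p = np.exp(w * sum(m * K for m, K in zip(mu_p, base)))
        grad[t] = (alignment(K2p, Y) - a0) / eps
    mu = np.maximum(mu + eta * grad, 0.0)
    mu /= mu.sum()                       # keep weights on the simplex
```

The optimized two-layer Gram matrix `np.exp(w * sum(m * K for m, K in zip(mu, base)))` could then be handed to any kernel classifier (e.g., an SVM) in place of a single predefined kernel.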
Pages: 2305-2314
Number of pages: 10