Optimally regularised kernel Fisher discriminant classification

Cited by: 18
Authors
Saadi, Kamel [1 ]
Talbot, Nicola L. C. [1 ]
Cawley, Gavin C. [1 ]
Affiliation
[1] Univ E Anglia, Sch Comp Sci, Norwich NR4 7TJ, Norfolk, England
Funding
Biotechnology and Biological Sciences Research Council (BBSRC), UK;
Keywords
model selection; cross-validation; least-squares support vector machine;
DOI
10.1016/j.neunet.2007.05.005
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
Mika, Rätsch, Weston, Schölkopf and Müller [Mika, S., Rätsch, G., Weston, J., Schölkopf, B., & Müller, K.-R. (1999). Fisher discriminant analysis with kernels. In Neural networks for signal processing: Vol. IX (pp. 41-48). New York: IEEE Press] introduce a non-linear formulation of Fisher's linear discriminant, based on the now familiar "kernel trick", demonstrating state-of-the-art performance on a wide range of real-world benchmark datasets. In this paper, we extend an existing analytical expression for the leave-one-out cross-validation error [Cawley, G. C., & Talbot, N. L. C. (2003b). Efficient leave-one-out cross-validation of kernel Fisher discriminant classifiers. Pattern Recognition, 36(11), 2585-2592] such that the leave-one-out error can be re-estimated following a change in the value of the regularisation parameter with a computational complexity of only O(l²) operations, which is substantially less than the O(l³) operations required for the basic training algorithm. This allows the regularisation parameter to be tuned at essentially negligible computational cost. The saving is achieved by performing the discriminant analysis in canonical form. The proposed method is therefore a useful component of a model selection strategy for this class of kernel machines that alternates between updates of the kernel and regularisation parameters. Results obtained on real-world and synthetic benchmark datasets indicate that the proposed method is competitive with model selection based on k-fold cross-validation in terms of generalisation, whilst being considerably faster. (C) 2007 Elsevier Ltd. All rights reserved.
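To illustrate the idea described in the abstract, below is a minimal Python sketch, not the authors' code. It assumes the usual equivalence between the kernel Fisher discriminant and kernel ridge regression on +/-1 targets, omits the bias term of the full KFD/LS-SVM model, and uses hypothetical names (rbf_kernel, CanonicalKFD). After a one-off O(l³) eigendecomposition of the kernel matrix (the "canonical form"), the closed-form leave-one-out residuals can be re-estimated for any new value of the regularisation parameter in O(l²) operations.

import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gaussian RBF kernel matrix K_ij = exp(-gamma * ||x_i - x_j||^2).
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

class CanonicalKFD:
    def __init__(self, K, y):
        # One-off O(l^3) cost: eigendecomposition K = V diag(lam) V^T.
        self.lam, self.V = np.linalg.eigh(K)
        self.Vty = self.V.T @ y            # projected targets, O(l^2)
        self.y = y

    def loo_error(self, reg):
        # O(l^2) per candidate regularisation parameter `reg`: the hat
        # matrix is H = V diag(lam/(lam+reg)) V^T, so the fitted values
        # and diag(H) need only matrix-vector work.
        shrink = self.lam / (self.lam + reg)                       # O(l)
        fitted = self.V @ (shrink * self.Vty)                      # O(l^2)
        h_diag = np.einsum('ij,j,ij->i', self.V, shrink, self.V)   # O(l^2)
        # Closed-form leave-one-out residuals for (kernel) ridge regression.
        loo_resid = (self.y - fitted) / (1.0 - h_diag)
        loo_pred = self.y - loo_resid
        # Misclassification rate of the leave-one-out predictions.
        return np.mean(np.sign(loo_pred) != np.sign(self.y))

# Usage: sweep the regularisation parameter cheaply after one decomposition.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] * X[:, 1] > 0, 1.0, -1.0)     # XOR-like toy labels
model = CanonicalKFD(rbf_kernel(X, gamma=0.5), y)
best = min(10.0**np.arange(-6, 3, dtype=float), key=model.loo_error)
print("selected regularisation parameter:", best)

The design point is that the eigenvectors do not change when only the regularisation parameter changes, so sweeping that parameter during model selection costs one O(l²) pass per candidate value rather than a fresh O(l³) retraining.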
Pages: 832-841
Page count: 10
Related papers
50 records in total
  • [41] An improved kernel fisher discriminant classifier and its applications
    Gao, DQ
    Wang, Z
    Li, YL
    Proceedings of the International Joint Conference on Neural Networks (IJCNN), Vols 1-5, 2005, : 1274 - 1279
  • [42] Input variable selection in kernel Fisher discriminant analysis
    Louw, N
    Steel, SJ
    FROM DATA AND INFORMATION ANALYSIS TO KNOWLEDGE ENGINEERING, 2006, : 126 - +
  • [43] Identification of Influential Cases in Kernel Fisher Discriminant Analysis
    Louw, Nelmarie
    Lamont, Morne M. C.
    Steel, Sarel J.
    COMMUNICATIONS IN STATISTICS-SIMULATION AND COMPUTATION, 2008, 37 (10) : 2050 - 2062
  • [44] Kernel Fisher discriminant analysis embedded with feature selection
    Wang, Yong-Qiao
    PROCEEDINGS OF 2007 INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND CYBERNETICS, VOLS 1-7, 2007, : 1160 - 1165
  • [45] On the equivalence of Kernel Fisher discriminant analysis and Kernel Quadratic Programming Feature Selection
    Rodriguez-Lujan, I.
    Santa Cruz, C.
    Huerta, R.
    PATTERN RECOGNITION LETTERS, 2011, 32 (11) : 1567 - 1571
  • [46] Classification of soil and vegetation by kernel Fisher and kernel PCA
    Chapron M.
    Bain G.
    Pattern Recognition and Image Analysis, 2011, 21 (3) : 462 - 466
  • [47] Classification of soil and vegetation by kernel fisher and kernel PCA
    Chapron M.
    Pattern Recognition and Image Analysis, 2013, 23 (1) : 51 - 56
  • [48] Object-oriented Classification of Polarimetric SAR Imagery based on Kernel Fisher Discriminant Dimensionality Reduction
    Cao, Han
    Zhang, Hong
    Wang, Chao
    Liu, Meng
    Wu, Fan
    11TH EUROPEAN CONFERENCE ON SYNTHETIC APERTURE RADAR (EUSAR 2016), 2016, : 440 - 443
  • [50] Wavelet Kernel Local Fisher Discriminant Analysis With Particle Swarm Optimization Algorithm for Bearing Defect Classification
    Van, Mien
    Kang, Hee-Jun
    IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2015, 64 (12) : 3588 - 3600