Boosting k-nearest neighbor classifier by means of input space projection

Cited by: 80
Authors
Garcia-Pedrajas, Nicolas [1 ]
Ortiz-Boyer, Domingo [1 ]
Affiliations
[1] Univ Cordoba, Dept Comp & Numer Anal, E-14071 Cordoba, Spain
Keywords
k-Nearest neighbors; Boosting; Subspace methods
DOI
10.1016/j.eswa.2009.02.065
CLC number
TP18 [Theory of artificial intelligence]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
The k-nearest neighbors classifier is one of the most widely used classification methods due to several attractive features, such as good generalization and easy implementation. Although simple, it is usually able to match, and even beat, more sophisticated and complex methods. However, no successful method has been reported so far for applying boosting to k-NN. As boosting has proved very effective at improving the generalization of many classification algorithms, an appropriate application of boosting to k-nearest neighbors is of great interest. Ensemble methods rely on the instability of the classifiers to improve their performance; because k-NN is fairly stable with respect to resampling, such methods fail to improve the performance of the k-NN classifier. On the other hand, k-NN is very sensitive to input selection, so ensembles based on subspace methods are able to improve the performance of single k-NN classifiers. In this paper we exploit the sensitivity of k-NN to the input space to develop two methods for boosting k-NN. Both approaches modify the view of the data that each classifier receives so that accurate classification of difficult instances is favored. The two approaches are compared with the classifier alone and with bagging and random subspace methods, showing a marked and significant improvement in generalization error. The comparison is performed on a large test set of 45 problems from the UCI Machine Learning Repository. A further study on noise tolerance shows that the proposed methods are less affected by class label noise than the standard methods. (C) 2009 Elsevier Ltd. All rights reserved.
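
The abstract summarizes the idea but gives no algorithmic detail. As a rough, hypothetical sketch of the principle it describes (k-NN is stable under resampling but sensitive to the input space, so each boosting round changes the feature view of the data rather than merely resampling), the following Python snippet combines AdaBoost-style instance reweighting with random feature subspaces. This is not the authors' algorithm from the paper; the function names and parameters (boosted_subspace_knn, n_rounds, subspace_frac) are assumptions made for illustration only.

import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def boosted_subspace_knn(X, y, n_rounds=10, subspace_frac=0.5, k=3, seed=0):
    """Illustrative sketch (not the paper's method): train an AdaBoost-style
    ensemble of k-NN classifiers, each on a random feature subspace of a
    weighted resample of the training set."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.full(n, 1.0 / n)                      # instance weights
    ensemble = []                                # (model, features, alpha)
    for _ in range(n_rounds):
        # Change the input view: pick a random feature subspace.
        feats = rng.choice(d, size=max(1, int(subspace_frac * d)), replace=False)
        # Weighted resample so difficult instances appear more often.
        idx = rng.choice(n, size=n, replace=True, p=w)
        clf = KNeighborsClassifier(n_neighbors=k).fit(X[np.ix_(idx, feats)], y[idx])
        miss = clf.predict(X[:, feats]) != y
        err = np.clip(w[miss].sum(), 1e-10, 1.0 - 1e-10)   # weighted error
        alpha = 0.5 * np.log((1.0 - err) / err)            # classifier weight
        w *= np.exp(alpha * miss)                          # upweight hard instances
        w /= w.sum()
        ensemble.append((clf, feats, alpha))
    return ensemble

def predict(ensemble, X, classes):
    """Combine the ensemble by alpha-weighted voting over class labels."""
    votes = np.zeros((X.shape[0], len(classes)))
    for clf, feats, alpha in ensemble:
        pred = clf.predict(X[:, feats])
        for j, c in enumerate(classes):
            votes[pred == c, j] += alpha
    return classes[np.argmax(votes, axis=1)]

With classes = np.unique(y_train), calling predict(boosted_subspace_knn(X_train, y_train), X_test, classes) yields the ensemble's predictions; keeping w uniform and alpha constant would reduce this sketch to a plain random-subspace ensemble, the baseline the paper compares against.
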
Pages: 10570 - 10582
Number of pages: 13
Related papers
50 records in total
  • [1] A direct boosting algorithm for the k-nearest neighbor classifier via local warping of the distance metric
    Neo, Toh Koon Charlie
    Ventura, Dan
    PATTERN RECOGNITION LETTERS, 2012, 33 (01) : 92 - 102
  • [2] K-Nearest Neighbor Search by Random Projection Forests
    Yan, Donghui
    Wang, Yingjie
    Wang, Jin
    Wang, Honggang
    Li, Zhenpeng
    IEEE TRANSACTIONS ON BIG DATA, 2021, 7 (01) : 147 - 157
  • [3] K-nearest Neighbor Search by Random Projection Forests
    Yan, Donghui
    Wang, Yingjie
    Wang, Jin
    Wang, Honggang
    Li, Zhenpeng
    2018 IEEE INTERNATIONAL CONFERENCE ON BIG DATA (BIG DATA), 2018, : 4775 - 4781
  • [4] A fuzzy K-nearest neighbor classifier to deal with imperfect data
    Cadenas, Jose M.
    Carmen Garrido, M.
    Martinez, Raquel
    Munoz, Enrique
    Bonissone, Piero P.
    SOFT COMPUTING, 2018, 22 (10) : 3313 - 3330
  • [5] A new globally adaptive k-nearest neighbor classifier based on local mean optimization
    Pan, Zhibin
    Pan, Yiwei
    Wang, Yidi
    Wang, Wei
    SOFT COMPUTING, 2021, 25 (03) : 2417 - 2431
  • [6] Asymptotics of k-nearest Neighbor Riesz Energies
    Hardin, Douglas P.
    Saff, Edward B.
    Vlasiuk, Oleksandr
    CONSTRUCTIVE APPROXIMATION, 2024, 59 (02) : 333 - 383
  • [7] Kinetic Reverse k-Nearest Neighbor Problem
    Rahmati, Zahed
    King, Valerie
    Whitesides, Sue
    COMBINATORIAL ALGORITHMS, IWOCA 2014, 2015, 8986 : 307 - 317
  • [8] K-nearest neighbor finding using MaxNearestDist
    Samet, Hanan
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2008, 30 (02) : 243 - 252
  • [9] Fast k-Nearest Neighbor Searching in Static Objects
    Lee, Jae Moon
    WIRELESS PERSONAL COMMUNICATIONS, 2017, 93 (01) : 147 - 160