Boosting k-nearest neighbor classifier by means of input space projection

Cited: 80
Authors
Garcia-Pedrajas, Nicolas [1]
Ortiz-Boyer, Domingo [1]
Affiliations
[1] Univ Cordoba, Dept Comp & Numer Anal, E-14071 Cordoba, Spain
Keywords
k-Nearest neighbors; Boosting; Subspace methods; TESTS;
DOI
10.1016/j.eswa.2009.02.065
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
The k-nearest neighbors classifier is one of the most widely used classification methods due to several attractive features, such as good generalization and easy implementation. Although simple, it is often able to match, and even beat, more sophisticated and complex methods. However, no successful method has been reported so far for applying boosting to k-NN. As boosting has proved very effective at improving the generalization of many classification algorithms, an appropriate application of boosting to k-nearest neighbors is of great interest. Ensemble methods rely on the instability of their base classifiers to improve performance; because k-NN is fairly stable with respect to resampling, resampling-based ensembles fail to improve the k-NN classifier. On the other hand, k-NN is very sensitive to input selection, so ensembles based on subspace methods are able to improve the performance of single k-NN classifiers. In this paper we exploit the sensitivity of k-NN to the input space to develop two methods for boosting k-NN. Both approaches modify the view of the data that each classifier receives so that accurate classification of difficult instances is favored. The two approaches are compared with the single classifier and with bagging and random subspace ensembles, showing a marked and significant improvement in generalization error. The comparison is performed on a large test set of 45 problems from the UCI Machine Learning Repository. A further study on noise tolerance shows that the proposed methods are less affected by class label noise than the standard methods. (C) 2009 Elsevier Ltd. All rights reserved.
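The abstract contrasts resampling ensembles (which fail for the stable k-NN classifier) with subspace ensembles, which exploit k-NN's sensitivity to input selection by training each member on a random subset of features. A minimal sketch of that baseline random-subspace k-NN ensemble is shown below; this is not the authors' boosting method, and all function names and parameters (`subspace_frac`, `n_estimators`) are illustrative assumptions.

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=3):
    """Plain Euclidean k-NN with majority vote."""
    preds = []
    for x in X_test:
        d = np.linalg.norm(X_train - x, axis=1)   # distances to all training points
        nn = y_train[np.argsort(d)[:k]]           # labels of the k nearest
        vals, counts = np.unique(nn, return_counts=True)
        preds.append(vals[np.argmax(counts)])     # majority label
    return np.array(preds)

def random_subspace_knn(X_train, y_train, X_test,
                        n_estimators=15, subspace_frac=0.5, k=3, rng=None):
    """Random-subspace ensemble: each k-NN member sees only a random
    subset of the input features; the final label is the ensemble's
    majority vote. This is the baseline the paper compares against,
    not the boosting-based projection method it proposes."""
    rng = np.random.default_rng(rng)
    n_features = X_train.shape[1]
    d = max(1, int(subspace_frac * n_features))
    votes = []
    for _ in range(n_estimators):
        feats = rng.choice(n_features, size=d, replace=False)
        votes.append(knn_predict(X_train[:, feats], y_train,
                                 X_test[:, feats], k))
    votes = np.stack(votes)                       # (n_estimators, n_test)
    final = []
    for col in votes.T:                           # majority vote per test point
        vals, counts = np.unique(col, return_counts=True)
        final.append(vals[np.argmax(counts)])
    return np.array(final)
```

Because each member classifier works in a different feature subspace, the members disagree even though k-NN is stable under resampling, which is exactly the diversity that subspace ensembles (and the paper's boosting variants) rely on.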
Pages: 10570-10582 (13 pages)
Related Papers (50 total)
  • [21] An optimized K-Nearest Neighbor algorithm based on Dynamic Distance approach
    Sadrabadi, Aireza Naser
    Znjirchi, Seyed Mahmood
    Abadi, Habib Zare Ahmad
    Hajimoradi, Ahmad
    2020 6TH IRANIAN CONFERENCE ON SIGNAL PROCESSING AND INTELLIGENT SYSTEMS (ICSPIS), 2020,
  • [22] Scalable processing of continuous K-nearest neighbor queries with uncertain velocity
    Lin, Lien-Fa
    Huang, Yuan-Ko
    EXPERT SYSTEMS WITH APPLICATIONS, 2011, 38 (08) : 9256 - 9265
  • [23] Forecasting and Avoiding Student Dropout Using the K-Nearest Neighbor Approach
    Mardolkar, M.
    Kumaran, N.
    SN Computer Science, 2020, 1 (2)
  • [24] Identification of model order and number of neighbors for k-nearest neighbor resampling
    Lee, Taesam
    Ouarda, Taha B. M. J.
    JOURNAL OF HYDROLOGY, 2011, 404 (3-4) : 136 - 145
  • [25] K-nearest neighbor based structural twin support vector machine
    Pan, Xianli
    Luo, Yao
    Xu, Yitian
    KNOWLEDGE-BASED SYSTEMS, 2015, 88 : 34 - 44
  • [26] Evaluating continuous K-nearest neighbor query on moving objects with uncertainty
    Huang, Yuan-Ko
    Liao, Shi-Jei
    Lee, Chiang
    INFORMATION SYSTEMS, 2009, 34 (4-5) : 415 - 437
  • [27] K-Nearest Neighbors Classifier for Field Bit Error Rate Data
    Allogba, Stephanie
    Tremblay, Christine
    2018 ASIA COMMUNICATIONS AND PHOTONICS CONFERENCE (ACP), 2018,
  • [28] Efficient k-nearest neighbors search in graph space
    Abu-Aisheh, Zeina
    Raveaux, Romain
    Ramel, Jean-Yves
    PATTERN RECOGNITION LETTERS, 2020, 134 (134) : 77 - 86
  • [29] An efficient regularized K-nearest neighbor structural twin support vector machine
    Xie, Fan
    Xu, Yitian
    APPLIED INTELLIGENCE, 2019, 49 (12) : 4258 - 4275
  • [30] User Classification Based on Mouse Dynamics Authentication using K-Nearest Neighbor
    Chandranegara, Didih Rizki
    Ashari, Anzilludin
    Sari, Zamah
    Wibowo, Hardianto
    Suharso, Wildan
    MAKARA JOURNAL OF TECHNOLOGY, 2023, 27 (01):