Boosting k-nearest neighbor classifier by means of input space projection

Cited by: 80
Authors
Garcia-Pedrajas, Nicolas [1 ]
Ortiz-Boyer, Domingo [1 ]
Affiliations
[1] Univ Cordoba, Dept Comp & Numer Anal, E-14071 Cordoba, Spain
Keywords
k-Nearest neighbors; Boosting; Subspace methods; TESTS;
DOI
10.1016/j.eswa.2009.02.065
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
The k-nearest neighbors (k-NN) classifier is one of the most widely used classification methods owing to several appealing properties, such as good generalization and easy implementation. Although simple, it is usually able to match, and even beat, more sophisticated and complex methods. However, no successful method for applying boosting to k-NN has been reported so far. As boosting has proved very effective in improving the generalization capabilities of many classification algorithms, an appropriate application of boosting to k-nearest neighbors is of great interest. Ensemble methods rely on the instability of the base classifiers to improve performance; because k-NN is fairly stable with respect to resampling, these methods fail to improve the k-NN classifier. On the other hand, k-NN is very sensitive to input selection, so ensembles based on subspace methods are able to improve the performance of single k-NN classifiers. In this paper we exploit the sensitivity of k-NN to the input space to develop two methods for boosting k-NN. Both approaches modify the view of the data that each classifier receives so that accurate classification of difficult instances is favored. The two approaches are compared with the classifier alone and with bagging and random subspace methods, showing a marked and significant improvement in generalization error. The comparison is performed on a large test set of 45 problems from the UCI Machine Learning Repository. A further study on noise tolerance shows that the proposed methods are less affected by class-label noise than the standard methods. (C) 2009 Elsevier Ltd. All rights reserved.
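The abstract describes the idea only at a high level. As a rough illustration of the general direction (not the authors' algorithm, whose input-space projections are specific to the paper), the minimal Python sketch below combines an AdaBoost-style reweighting loop with per-round random feature subspaces for k-NN. The function names boosted_subspace_knn and predict_ensemble, the subspace_size parameter, and the boosting-by-resampling step are hypothetical; scikit-learn's KNeighborsClassifier and NumPy are assumed to be available.

    # Hypothetical sketch (not the paper's method): AdaBoost-style reweighting
    # combined with per-round random feature subspaces for k-NN classifiers.
    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier

    def boosted_subspace_knn(X, y, n_rounds=10, subspace_size=0.5, k=3, seed=0):
        rng = np.random.default_rng(seed)
        n, d = X.shape
        w = np.full(n, 1.0 / n)          # instance weights (difficulty)
        ensemble = []                    # (classifier, feature indices, vote weight)
        for _ in range(n_rounds):
            feats = rng.choice(d, size=max(1, int(subspace_size * d)), replace=False)
            # Boosting by resampling: difficult (high-weight) instances appear more
            # often, so each k-NN sees a different view of the data and input space.
            idx = rng.choice(n, size=n, replace=True, p=w)
            clf = KNeighborsClassifier(n_neighbors=k).fit(X[idx][:, feats], y[idx])
            pred = clf.predict(X[:, feats])
            miss = pred != y
            err = w[miss].sum()
            if err == 0 or err >= 0.5:   # skip degenerate rounds
                continue
            alpha = 0.5 * np.log((1.0 - err) / err)   # vote weight, as in AdaBoost
            w *= np.exp(np.where(miss, alpha, -alpha))  # up-weight misclassified
            w /= w.sum()
            ensemble.append((clf, feats, alpha))
        return ensemble

    def predict_ensemble(ensemble, X, classes):
        votes = np.zeros((X.shape[0], len(classes)))
        for clf, feats, alpha in ensemble:
            pred = clf.predict(X[:, feats])
            for ci, c in enumerate(classes):
                votes[pred == c, ci] += alpha        # weighted vote per class
        return classes[np.argmax(votes, axis=1)]

A typical call under these assumptions would be ensemble = boosted_subspace_knn(X_train, y_train) followed by y_hat = predict_ensemble(ensemble, X_test, np.unique(y_train)).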
Pages: 10570-10582
Number of pages: 13