Fuzzy KNN Method With Adaptive Nearest Neighbors

Cited by: 32
Authors
Bian, Zekang [1 ,2 ]
Vong, Chi Man [3 ]
Wong, Pak Kin [4 ]
Wang, Shitong [1 ,2 ]
Affiliations
[1] Jiangnan Univ, Sch Digital Media, Wuxi 214122, Jiangsu, Peoples R China
[2] Jiangnan Univ, Jiangsu Prov Key Lab Media Design & Software Tech, Wuxi 214122, Jiangsu, Peoples R China
[3] Univ Macau, Dept Comp & Informat Sci, Fac Sci & Technol, Macau, Peoples R China
[4] Univ Macau, Fac Sci & Technol, Dept Elect Engn, Macau, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
decision tree; fuzzy k-nearest-neighbor method (FKNN); nearest neighbors; sparse representation/reconstruction; random forest; algorithm; selection; classification; complexity; shrinkage
DOI
10.1109/TCYB.2020.3031610
Chinese Library Classification
TP [automation technology; computer technology];
Subject Classification Code
0812;
Abstract
Owing to its strong performance in handling uncertain and ambiguous data, the fuzzy k-nearest-neighbor method (FKNN) has achieved substantial success in a wide variety of applications. However, its classification performance deteriorates heavily when the number k of nearest neighbors is fixed unsuitably for the testing samples. This study questions the common practice of using a single fixed k value for all testing samples and devises a novel FKNN-based classification method, namely, the fuzzy KNN method with adaptive nearest neighbors (A-FKNN), which learns a distinct optimal k value for each testing sample. In the training stage, after applying a sparse representation method to all training samples for reconstruction, A-FKNN learns the optimal k value for each training sample and builds a decision tree (the A-FKNN tree) from all training samples relabeled with their learned optimal k values instead of their original class labels, so that each leaf node stores the corresponding optimal k value. In the testing stage, A-FKNN finds the optimal k value for each testing sample by searching the A-FKNN tree and then runs FKNN with that value. Moreover, a fast version of A-FKNN, namely, FA-FKNN, is designed by building the FA-FKNN decision tree, which stores the optimal k value together with only a subset of the training samples in each leaf node. Experimental results on 32 UCI datasets demonstrate that both A-FKNN and FA-FKNN outperform the compared methods in classification accuracy, while FA-FKNN additionally achieves a shorter running time.
Pages: 5380-5393
Page count: 14