Fuzzy KNN Method With Adaptive Nearest Neighbors

Citations: 36
Authors
Bian, Zekang [1 ,2 ]
Vong, Chi Man [3 ]
Wong, Pak Kin [4 ]
Wang, Shitong [1 ,2 ]
Affiliations
[1] Jiangnan Univ, Sch Digital Media, Wuxi 214122, Jiangsu, Peoples R China
[2] Jiangnan Univ, Jiangsu Prov Key Lab Media Design & Software Tech, Wuxi 214122, Jiangsu, Peoples R China
[3] Univ Macau, Dept Comp & Informat Sci, Fac Sci & Technol, Macau, Peoples R China
[4] Univ Macau, Fac Sci & Technol, Dept Elect Engn, Macau, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Decision tree; fuzzy k-nearest-neighbor method (FKNN); nearest neighbors; sparse representation/reconstruction; RANDOM FOREST; ALGORITHM; SELECTION; CLASSIFICATION; COMPLEXITY; SHRINKAGE;
DOI
10.1109/TCYB.2020.3031610
Chinese Library Classification (CLC)
TP [automation technology, computer technology];
Discipline code
0812;
Abstract
Due to its strong performance in handling uncertain and ambiguous data, the fuzzy k-nearest-neighbor method (FKNN) has achieved substantial success in a wide variety of applications. However, its classification performance deteriorates heavily when the number k of nearest neighbors is fixed at a value unsuitable for a given testing sample, which calls into question the common practice of using a single fixed k for all testing samples. A novel FKNN-based classification method, the fuzzy KNN method with adaptive nearest neighbors (A-FKNN), is therefore devised to learn a distinct optimal k value for each testing sample. In the training stage, after applying a sparse representation method to all training samples for reconstruction, A-FKNN learns the optimal k value for each training sample and builds a decision tree (the A-FKNN tree) from all training samples relabeled with their learned optimal k values instead of their original class labels, so that each leaf node stores the corresponding optimal k value. In the testing stage, A-FKNN identifies the optimal k value for each testing sample by searching the A-FKNN tree and then runs FKNN with that k value. Moreover, a fast version of A-FKNN, namely FA-FKNN, is designed by building the FA-FKNN decision tree, which stores the optimal k value together with only a subset of the training samples in each leaf node. Experimental results on 32 UCI datasets demonstrate that both A-FKNN and FA-FKNN outperform the compared methods in classification accuracy, and that FA-FKNN has a shorter running time.
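The pipeline the abstract describes — learn a per-training-sample optimal k, then reuse a learned k at test time — can be sketched roughly as follows. This is an illustrative approximation, not the paper's method: it replaces the sparse-reconstruction step and the A-FKNN decision tree with a simple leave-one-out search for each training sample's smallest correct k and a nearest-neighbor lookup of that k at test time. All function names and the fuzzifier value m are assumptions, not taken from the paper.

```python
import numpy as np

def fknn_predict(X_train, y_train, x, k, m=2.0):
    """Classic fuzzy KNN: weight the k nearest neighbors by inverse
    distance raised to 2/(m-1), then pick the class with the largest
    total membership. (m is the fuzzifier; m=2 is a common default.)"""
    d = np.linalg.norm(X_train - x, axis=1)
    idx = np.argsort(d)[:k]
    w = 1.0 / (d[idx] ** (2.0 / (m - 1.0)) + 1e-12)  # epsilon avoids /0
    classes = np.unique(y_train)
    memberships = [w[y_train[idx] == c].sum() for c in classes]
    return classes[int(np.argmax(memberships))]

def learn_optimal_k(X_train, y_train, k_candidates):
    """Stand-in for the paper's training stage: for each training sample,
    leave it out and pick the smallest candidate k under which fuzzy KNN
    classifies it correctly (falling back to the largest candidate)."""
    ks = []
    n = len(X_train)
    for i in range(n):
        mask = np.arange(n) != i
        best = k_candidates[-1]
        for k in k_candidates:
            if fknn_predict(X_train[mask], y_train[mask], X_train[i], k) == y_train[i]:
                best = k
                break
        ks.append(best)
    return np.array(ks)

def adaptive_predict(X_train, y_train, ks, x):
    """Stand-in for the A-FKNN tree lookup: borrow the learned optimal k
    of the single nearest training sample, then run fuzzy KNN with it."""
    j = int(np.argmin(np.linalg.norm(X_train - x, axis=1)))
    return fknn_predict(X_train, y_train, x, ks[j])

# Tiny usage example on two well-separated clusters (hypothetical data).
X = np.array([[0., 0.], [0., 1.], [1., 0.], [5., 5.], [5., 6.], [6., 5.]])
y = np.array([0, 0, 0, 1, 1, 1])
ks = learn_optimal_k(X, y, [1, 3, 5])
print(adaptive_predict(X, y, ks, np.array([0.2, 0.2])))  # class 0
print(adaptive_predict(X, y, ks, np.array([5.5, 5.5])))  # class 1
```

The point of the design is that k becomes a per-sample quantity learned from the training data rather than a global hyperparameter; the paper's decision tree makes the test-time lookup fast, which this 1-NN stand-in only approximates.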
Pages: 5380-5393 (14 pages)