A Local Mean Representation-based K-Nearest Neighbor Classifier

Cited by: 61
Authors
Gou, Jianping [1,4]
Qiu, Wenmo [1,4]
Yi, Zhang [2]
Xu, Yong [3]
Mao, Qirong [4]
Zhan, Yongzhao [4]
Affiliations
[1] Jiangsu Univ, Jiangsu Key Lab Secur Technol Ind Cyberspace, Zhenjiang 212013, Jiangsu, Peoples R China
[2] Sichuan Univ, Sch Comp Sci, Chengdu 610065, Sichuan, Peoples R China
[3] Harbin Inst Technol, Shenzhen Grad Sch, Biocomp Res Ctr, Shenzhen 518055, Guangdong, Peoples R China
[4] Jiangsu Univ, Sch Comp Sci & Telecommun Engn, Zhenjiang 212013, Jiangsu, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
K-nearest neighbor classification; local mean vector; representation; pattern recognition; STATISTICAL COMPARISONS; RULE; EVOLUTIONARY; ALGORITHMS; SELECTION;
DOI
10.1145/3319532
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
The K-nearest neighbor (KNN) classifier, one of the top 10 algorithms in data mining, is a simple yet effective nonparametric technique for pattern recognition. However, its classification performance is easily degraded by the sensitivity to the choice of the neighborhood size k, the simple majority vote, and the conventional distance metric, especially when the training sample size is small. In this article, to further improve classification performance and overcome these issues, we propose a local mean representation-based k-nearest neighbor classifier (LMRKNN). In LMRKNN, the categorical k-nearest neighbors of a query sample are first selected to compute the corresponding categorical k-local mean vectors; the query sample is then represented as a linear combination of these categorical k-local mean vectors; finally, the class-specific representation-based distances between the query sample and the categorical k-local mean vectors determine the class of the query sample. Extensive experiments on many UCI and KEEL datasets and three popular face databases compare LMRKNN with state-of-the-art KNN-based methods. The results demonstrate that the proposed LMRKNN outperforms the competing KNN-based methods in both robustness and effectiveness.
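As a rough illustration of the three-step decision rule described in the abstract, the sketch below implements it in NumPy under stated assumptions: the neighbor search uses the Euclidean metric, the categorical k-local mean vectors are read here as the cumulative means of the 1, ..., k nearest neighbors within each class (one common construction; the exact definition is given in the paper), and a small ridge term `lam` is added to keep the least-squares coefficients well conditioned. The function name `lmrknn_predict` and the default parameter values are illustrative, not taken from the paper.

```python
import numpy as np

def lmrknn_predict(X_train, y_train, x_query, k=5, lam=1e-3):
    """Minimal sketch of an LMRKNN-style decision rule.

    For each class: take the k nearest training samples to the query,
    form k cumulative local mean vectors, represent the query as a
    (ridge-regularized) linear combination of those vectors, and assign
    the class whose representation residual is smallest.
    """
    best_class, best_residual = None, np.inf
    for c in np.unique(y_train):
        Xc = X_train[y_train == c]
        # k nearest neighbors of the query within class c (Euclidean metric)
        dists = np.linalg.norm(Xc - x_query, axis=1)
        nn = Xc[np.argsort(dists)[:min(k, len(Xc))]]
        # categorical local mean vectors: cumulative means of the 1..k nearest neighbors
        M = np.cumsum(nn, axis=0) / np.arange(1, len(nn) + 1)[:, None]  # shape (k, d)
        # represent the query as a linear combination of the local mean vectors
        # via regularized least squares: w = (M M^T + lam I)^{-1} M x
        G = M @ M.T + lam * np.eye(len(M))
        w = np.linalg.solve(G, M @ x_query)
        # class-specific representation-based distance (residual norm)
        residual = np.linalg.norm(x_query - M.T @ w)
        if residual < best_residual:
            best_class, best_residual = c, residual
    return best_class
```

Given NumPy arrays X_train (n x d) and y_train (n,) and a query vector x, `lmrknn_predict(X_train, y_train, x, k=7)` returns the predicted label; in practice the neighborhood size k (and any regularization) would typically be tuned by cross-validation.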
Pages: 25