A Local Mean Representation-based K-Nearest Neighbor Classifier

Cited by: 62
Authors
Gou, Jianping [1 ,4 ]
Qiu, Wenmo [1 ,4 ]
Yi, Zhang [2 ]
Xu, Yong [3 ]
Mao, Qirong [4 ]
Zhan, Yongzhao [4 ]
Affiliations
[1] Jiangsu Univ, Jiangsu Key Lab Secur Technol Ind Cyberspace, Zhenjiang 212013, Jiangsu, Peoples R China
[2] Sichuan Univ, Sch Comp Sci, Chengdu 610065, Sichuan, Peoples R China
[3] Harbin Inst Technol, Shenzhen Grad Sch, Biocomp Res Ctr, Shenzhen 518055, Guangdong, Peoples R China
[4] Jiangsu Univ, Sch Comp Sci & Telecommun Engn, Zhenjiang 212013, Jiangsu, Peoples R China
Fund
National Natural Science Foundation of China;
Keywords
K-nearest neighbor classification; local mean vector; representation; pattern recognition; STATISTICAL COMPARISONS; RULE; EVOLUTIONARY; ALGORITHMS; SELECTION;
DOI
10.1145/3319532
CLC number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The k-nearest neighbor (KNN) classification method, one of the top 10 algorithms in data mining, is a simple yet effective nonparametric technique for pattern recognition. However, because of its sensitivity to the choice of the neighborhood size k, the simple majority vote, and the conventional distance metric, KNN-based classification performance can be easily degraded, especially with small training sample sizes. In this article, to further improve classification performance and overcome these main issues in KNN-based classification, we propose a local mean representation-based k-nearest neighbor classifier (LMRKNN). In LMRKNN, the categorical k-nearest neighbors of a query sample are first chosen to calculate the corresponding categorical k local mean vectors; the query sample is then represented as a linear combination of the categorical k local mean vectors; finally, the class-specific representation-based distances between the query sample and the categorical k local mean vectors are used to determine the class of the query sample. Extensive experiments on many UCI and KEEL datasets and three popular face databases compare LMRKNN to state-of-the-art KNN-based methods. The experimental results demonstrate that the proposed LMRKNN outperforms the competing KNN-based methods in both robustness and effectiveness.
Pages: 25