Attention-based Local Mean K-Nearest Centroid Neighbor Classifier

Cited: 11
Authors
Ma, Ying [1 ]
Huang, Rui [1 ]
Yan, Ming [2 ]
Li, Guoqi [3 ]
Wang, Tian [4 ]
Affiliations
[1] Xiamen Univ Technol, Dept Comp & Informat Engn, Xiamen 361024, Peoples R China
[2] Agcy Sci Technol & Res, Inst High Performance Comp, Singapore, Singapore
[3] Tsinghua Univ, Ctr Brain Inspired Comp Res, Dept Precis Instrument, Beijing 100084, Peoples R China
[4] Beijing Normal Univ BNU Zhuhai, BNU UIC Inst Artificial Intelligence & Future Net, Zhuhai, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Attention mechanism; K-Nearest Neighbor; Pattern classification; Data mining; Algorithms; Speech; Model; Rule;
DOI
10.1016/j.eswa.2022.117159
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Code
081104; 0812; 0835; 1405;
Abstract
Among classic data mining algorithms, K-Nearest Neighbor (KNN)-based methods are effective and straightforward solutions for classification tasks. However, most KNN-based methods do not fully account for the varying influence that individual training samples have on a query, which degrades classification performance. To address this issue, we propose the Attention-based Local Mean K-Nearest Centroid Neighbor Classifier (ALMKNCN), which bridges nearest centroid neighbor computation with the attention mechanism and thereby fully accounts for the influence of each training sample on the query. Specifically, we first calculate the local centroids of each class with respect to the given query pattern. Then, ALMKNCN applies the attention mechanism to compute a weight for the pseudo-distance between the query sample and each class centroid. Finally, the distances between the query sample and the local mean vectors are weighted by the attention coefficients to predict the query sample's class. Extensive experiments on real and synthetic data sets compare ALMKNCN with state-of-the-art KNN-based methods; the results demonstrate that ALMKNCN outperforms the compared methods by large margins.
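The abstract outlines a three-step procedure: select each class's nearest centroid neighbors for the query, form the per-class local mean vectors, and weight the query-to-mean distances with attention coefficients. As a rough illustration only, below is a minimal NumPy sketch of that pipeline. The softmax-over-negative-distances attention form, the temperature parameter, and all function names are assumptions of this sketch; the paper's exact attention coefficients are not given in this record.

import numpy as np

def k_nearest_centroid_neighbors(x, samples, k):
    # KNCN selection (Sanchez et al., 1997): greedily add the sample whose
    # inclusion moves the centroid of the selected set closest to the query x.
    samples = np.asarray(samples, dtype=float)
    selected, remaining = [], list(range(len(samples)))
    running_sum = np.zeros_like(x, dtype=float)
    for step in range(1, min(k, len(samples)) + 1):
        dists = [np.linalg.norm(x - (running_sum + samples[j]) / step)
                 for j in remaining]
        best = remaining[int(np.argmin(dists))]
        selected.append(best)
        running_sum += samples[best]
        remaining.remove(best)
    return samples[selected]

def almkncn_predict(x, X, y, k, temperature=1.0):
    # Hypothetical ALMKNCN sketch: per-class local mean of the k nearest
    # centroid neighbors, scored by softmax attention over negative distances
    # (an assumed form; the paper defines its own attention coefficients).
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    classes = np.unique(y)
    local_means = np.array([
        k_nearest_centroid_neighbors(x, X[y == c], k).mean(axis=0)
        for c in classes])
    d = np.linalg.norm(local_means - x, axis=1)  # query-to-local-mean distances
    attn = np.exp(-d / temperature)
    attn /= attn.sum()                           # attention over the classes
    return classes[int(np.argmax(attn))]         # highest-attention class wins

# Toy usage: two Gaussian blobs, query near the second class.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(3, 1, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
print(almkncn_predict(np.array([2.5, 2.5]), X, y, k=5))  # expected: 1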
Pages: 11