Attention-based Local Mean K-Nearest Centroid Neighbor Classifier

Cited by: 11
Authors
Ma, Ying [1 ]
Huang, Rui [1 ]
Yan, Ming [2 ]
Li, Guoqi [3 ]
Wang, Tian [4 ]
Affiliations
[1] Xiamen Univ Technol, Dept Comp & Informat Engn, Xiamen 361024, Peoples R China
[2] Agcy Sci Technol & Res, Inst High Performance Comp, Singapore, Singapore
[3] Tsinghua Univ, Ctr Brain Inspired Comp Res, Dept Precis Instrument, Beijing 100084, Peoples R China
[4] Beijing Normal Univ BNU Zhuhai, BNU UIC Inst Artificial Intelligence & Future Net, Zhuhai, Peoples R China
Funding
National Natural Science Foundation of China;
关键词
Attention mechanism; K-Nearest Neighbor; Pattern classification; Data mining; ALGORITHMS; SPEECH; MODEL; RULE;
DOI
10.1016/j.eswa.2022.117159
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Among classic data mining algorithms, K-Nearest Neighbor (KNN)-based methods are effective and straightforward solutions for classification tasks. However, most KNN-based methods do not fully account for the differing influence of individual training samples on classification, which degrades performance. To address this issue, we propose the Attention-based Local Mean K-Nearest Centroid Neighbor Classifier (ALMKNCN), which bridges nearest centroid neighbor computation with the attention mechanism and thus fully considers the influence of each training sample on the query. Specifically, we first calculate the local centroids of each class for the given query pattern. Then, ALMKNCN applies the attention mechanism to compute a weight for the pseudo-distance between the query sample and each local class centroid. Finally, based on the attention coefficients, the distances between the query sample and the local mean vectors are weighted to predict the class of the query sample. Extensive experiments on real and synthetic data sets compare ALMKNCN with state-of-the-art KNN-based methods; the results demonstrate that ALMKNCN outperforms the compared methods by large margins.
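The abstract's three steps (local class centroids via nearest centroid neighbors, attention weights over the resulting distances, weighted scoring per class) can be sketched in NumPy. This is a minimal illustrative sketch assuming a softmax-style attention over negative distances and greedy KNCN selection; the function name, `temperature` parameter, and exact weighting are assumptions, not the authors' reference implementation.

```python
import numpy as np

def almkncn_predict(X_train, y_train, x_query, k=5, temperature=1.0):
    """Sketch of attention-weighted local-mean nearest-centroid-neighbor
    prediction (hypothetical details; see the paper for the exact rule)."""
    scores = {}
    for c in np.unique(y_train):
        Xc = X_train[y_train == c]
        # KNCN rule: greedily pick the sample whose running centroid
        # (with those already chosen) stays closest to the query.
        chosen, remaining = [], list(range(len(Xc)))
        for _ in range(min(k, len(Xc))):
            best, best_d = None, np.inf
            for i in remaining:
                d = np.linalg.norm(np.mean(Xc[chosen + [i]], axis=0) - x_query)
                if d < best_d:
                    best, best_d = i, d
            chosen.append(best)
            remaining.remove(best)
        # Local mean vectors of the first 1..k centroid neighbors.
        local_means = np.array([np.mean(Xc[chosen[:j + 1]], axis=0)
                                for j in range(len(chosen))])
        dists = np.linalg.norm(local_means - x_query, axis=1)
        # Attention coefficients: softmax over negative distances, so
        # closer local means receive larger weights.
        att = np.exp(-dists / temperature)
        att /= att.sum()
        # Class score = attention-weighted distance; smaller is better.
        scores[c] = float(np.sum(att * dists))
    return min(scores, key=scores.get)
```

The query is assigned to the class whose attention-weighted local-mean distance is smallest; the greedy centroid-neighbor selection is what distinguishes KNCN-style methods from plain KNN.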
Pages: 11