Attention-based Local Mean K-Nearest Centroid Neighbor Classifier

Cited by: 11
Authors
Ma, Ying [1 ]
Huang, Rui [1 ]
Yan, Ming [2 ]
Li, Guoqi [3 ]
Wang, Tian [4 ]
Affiliations
[1] Xiamen Univ Technol, Dept Comp & Informat Engn, Xiamen 361024, Peoples R China
[2] Agcy Sci Technol & Res, Inst High Performance Comp, Singapore, Singapore
[3] Tsinghua Univ, Ctr Brain Inspired Comp Res, Dept Precis Instrument, Beijing 100084, Peoples R China
[4] Beijing Normal Univ BNU Zhuhai, BNU UIC Inst Artificial Intelligence & Future Net, Zhuhai, Peoples R China
Funding
National Natural Science Foundation of China;
关键词
Attention mechanism; K-Nearest Neighbor; Pattern classification; Data mining; ALGORITHMS; SPEECH; MODEL; RULE;
DOI
10.1016/j.eswa.2022.117159
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Among classic data mining algorithms, K-Nearest Neighbor (KNN)-based methods are effective and straightforward solutions for classification tasks. However, most KNN-based methods do not fully consider the influence of individual training samples on classification, which leads to a decline in performance. To address this issue, we propose a method named Attention-based Local Mean K-Nearest Centroid Neighbor Classifier (ALMKNCN), which bridges nearest centroid neighbor computation with the attention mechanism and fully accounts for the influence of each training sample on the query sample. Specifically, we first calculate the local centroids of each class with respect to the given query pattern. Then, ALMKNCN applies the attention mechanism to weight the pseudo-distance between the test sample and each class centroid. Finally, based on the attention coefficients, the distances between the query sample and the local mean vectors are weighted to predict the query sample's class. Extensive experiments are carried out on real and synthetic data sets, comparing ALMKNCN with state-of-the-art KNN-based methods. The experimental results demonstrate that ALMKNCN outperforms the compared methods by large margins.
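The pipeline described in the abstract (per-class local centroid neighbors, attention over pseudo-distances, weighted decision) can be sketched roughly as follows. This is an illustrative reconstruction, not the paper's exact formulation: the greedy nearest-centroid-neighbor selection, the softmax form of the attention weights, and the final decision rule are all assumptions.

```python
import numpy as np

def k_nearest_centroid_neighbors(x, samples, k):
    """Greedy KNCN selection (assumed variant): repeatedly add the sample
    that keeps the centroid of the chosen set closest to the query x."""
    chosen, remaining = [], list(range(len(samples)))
    for _ in range(k):
        best_i, best_d = None, np.inf
        for i in remaining:
            centroid = samples[chosen + [i]].mean(axis=0)
            d = np.linalg.norm(centroid - x)
            if d < best_d:
                best_i, best_d = i, d
        chosen.append(best_i)
        remaining.remove(best_i)
    return samples[chosen]

def almkncn_predict(x, X, y, k=3, temperature=1.0):
    """Predict the class of query x: build a local mean vector per class
    from its k nearest centroid neighbors, turn the query-to-mean
    distances into attention coefficients via a softmax (assumed form),
    and pick the class with the largest attention coefficient."""
    classes = np.unique(y)
    dists = np.array([
        np.linalg.norm(
            k_nearest_centroid_neighbors(x, X[y == c], k).mean(axis=0) - x)
        for c in classes
    ])
    # Attention over negative distances: closer local means get more weight.
    att = np.exp(-dists / temperature)
    att /= att.sum()
    return classes[np.argmax(att)]
```

On a toy two-class data set, `almkncn_predict` assigns a query to the class whose attention-weighted local mean is nearest; the `temperature` parameter (a hypothetical knob, not from the paper) controls how sharply the attention concentrates on the closest class.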
Pages: 11