Hypersphere anchor loss for K-nearest neighbors

Cited by: 1
Authors
Ye, Xiang [1]
He, Zihang [1]
Wang, Heng [1]
Li, Yong [1]
Affiliations
[1] Beijing University of Posts and Telecommunications, School of Electronic Engineering, Beijing 100876, China
Funding
National Natural Science Foundation of China
Keywords
K-nearest neighbors; Convolutional neural network; Image classification; Loss function; Classification
DOI
10.1007/s10489-023-05148-5
CLC number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Learning effective feature spaces for KNN (K-Nearest Neighbor) classifiers is critical to their performance. Existing KNN loss functions, designed to optimize CNNs in R^n feature spaces for specific KNN classifiers, greatly boost performance, but they must compute the pairwise distances within each batch, which demands substantial GPU computation and memory. This paper develops lightweight KNN loss functions that reduce the computational cost while achieving performance comparable to, or better than, that of existing KNN loss functions. To this end, an anchor loss function is proposed that assigns each category an anchor vector in the KNN feature space and introduces the distances between training samples and anchor vectors into the NCA (Neighborhood Component Analysis) objective. The proposed anchor loss greatly reduces the computation required by existing KNN loss functions. In addition, instead of optimizing CNNs in R^n feature spaces, this paper proposes optimizing them in hypersphere feature spaces, which yields faster convergence and better performance. The anchor loss optimized in the hypersphere feature space is called HAL (Hypersphere Anchor Loss). Experiments on image classification benchmarks show that HAL reduces the computational cost and achieves better performance: on CIFAR-10 and Fashion-MNIST, HAL improves accuracy by over 1% compared with existing KNN loss functions, while the computational cost drops to less than 10% of theirs.
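
To make the mechanism concrete, below is a minimal sketch of an NCA-style anchor loss computed on the unit hypersphere, assuming a PyTorch setting. The class name HypersphereAnchorLoss, the random anchor initialization, and the exact squared-distance form are illustrative assumptions, not the paper's reference implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class HypersphereAnchorLoss(nn.Module):
    """Illustrative NCA-style anchor loss on the unit hypersphere.

    Each class owns one learnable anchor vector. Features and anchors are
    L2-normalized, so distances are measured on the hypersphere rather than
    in R^n, and each sample is compared to C class anchors instead of to
    every other sample in the batch.
    """

    def __init__(self, num_classes: int, feat_dim: int):
        super().__init__()
        # One anchor vector per category (assumption: random initialization).
        self.anchors = nn.Parameter(torch.randn(num_classes, feat_dim))

    def forward(self, features: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
        z = F.normalize(features, dim=1)       # project CNN features onto the sphere
        a = F.normalize(self.anchors, dim=1)   # project anchors onto the sphere
        # On the unit sphere, ||z - a||^2 = 2 - 2 * <z, a>.
        sq_dist = 2.0 - 2.0 * (z @ a.t())      # shape: (batch, num_classes)
        # NCA-style assignment: softmax over negative squared distances.
        log_prob = F.log_softmax(-sq_dist, dim=1)
        return F.nll_loss(log_prob, labels)

# Toy usage with stand-in features from a CNN backbone.
loss_fn = HypersphereAnchorLoss(num_classes=10, feat_dim=64)
feats = torch.randn(32, 64)
labels = torch.randint(0, 10, (32,))
loss = loss_fn(feats, labels)

Under these assumptions, each sample is compared against one anchor per class rather than against every other sample in the batch, so the per-batch cost is O(batch x classes) instead of the O(batch^2) pairwise-distance computation of earlier KNN losses, which matches the abstract's motivation.
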
Pages: 30319-30328
Page count: 10