Hypersphere anchor loss for K-Nearest neighbors

Cited: 0
Authors
Xiang Ye
Zihang He
Heng Wang
Yong Li
Affiliations
[1] Beijing University of Posts and Telecommunications, School of Electronic Engineering
Source
Applied Intelligence | 2023, Vol. 53
Keywords
K-Nearest neighbors; Convolutional neural network; Image classification; Loss function
DOI
Not available
Abstract
Learning effective feature spaces for KNN (K-Nearest Neighbor) classifiers is critical for their performance. Existing KNN loss functions designed to optimize CNNs in $\mathbb{R}^n$ feature spaces for specific KNN classifiers greatly boost performance. However, these loss functions need to compute the pairwise distances within each batch, which requires large computational resources, GPU time, and memory. This paper aims to develop lightweight KNN loss functions that reduce the computational cost while achieving performance comparable to or better than existing KNN loss functions. To this end, an anchor loss function is proposed that assigns each category an anchor vector in the KNN feature space and introduces the distances between training samples and anchor vectors into the NCA (Neighborhood Component Analysis) function. The proposed anchor loss function largely reduces the computation required by existing KNN loss functions. In addition, instead of optimizing CNNs in $\mathbb{R}^n$ feature spaces, this paper proposes to optimize them in hypersphere feature spaces for faster convergence and better performance. The proposed anchor loss optimized in the hypersphere feature space is called HAL (Hypersphere Anchor Loss). Experiments on various image classification benchmarks show that HAL reduces the computational cost and achieves better performance: on the CIFAR-10 and Fashion-MNIST datasets, compared with existing KNN loss functions, HAL improves the accuracy by over 1%, and the computational cost decreases to less than 10%.
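The abstract describes HAL only at a high level. Below is a minimal, illustrative PyTorch sketch of an NCA-style anchor loss computed on the unit hypersphere: each class gets a learnable anchor vector, features and anchors are L2-normalized, and a softmax over negative squared anchor distances pulls each sample toward its class anchor. The class name, the temperature parameter, and the use of cross-entropy over these distances are assumptions made for illustration and do not reproduce the authors' exact formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class HypersphereAnchorLoss(nn.Module):
    """Illustrative NCA-style anchor loss on the unit hypersphere (assumed form)."""

    def __init__(self, num_classes: int, feat_dim: int, scale: float = 10.0):
        super().__init__()
        # One learnable anchor vector per class, trained jointly with the CNN.
        self.anchors = nn.Parameter(torch.randn(num_classes, feat_dim))
        self.scale = scale  # softmax temperature; a hypothetical hyperparameter

    def forward(self, features: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
        # Project CNN features and anchors onto the unit hypersphere.
        f = F.normalize(features, dim=1)      # (B, D)
        a = F.normalize(self.anchors, dim=1)  # (C, D)
        # Squared Euclidean distance between unit vectors: ||f - a||^2 = 2 - 2 f.a
        sq_dist = 2.0 - 2.0 * f @ a.t()       # (B, C), no pairwise sample distances needed
        # NCA-like objective: softmax over negative distances to the class anchors,
        # maximizing the probability assigned to the true-class anchor.
        logits = -self.scale * sq_dist
        return F.cross_entropy(logits, labels)


if __name__ == "__main__":
    # Usage sketch: features would come from a CNN backbone's penultimate layer.
    loss_fn = HypersphereAnchorLoss(num_classes=10, feat_dim=64)
    feats = torch.randn(32, 64)
    labels = torch.randint(0, 10, (32,))
    loss = loss_fn(feats, labels)
    loss.backward()
    print(float(loss))
```

Because the loss only measures distances between each sample and the C class anchors (a B x C matrix) rather than all B x B sample pairs in the batch, its cost grows with the number of classes instead of the batch size, which is consistent with the computational savings the abstract reports.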
Pages: 30319 - 30328
Page count: 9
Related papers
50 records in total
  • [1] Hypersphere anchor loss for K-Nearest neighbors
    Ye, Xiang
    He, Zihang
    Wang, Heng
    Li, Yong
    APPLIED INTELLIGENCE, 2023, 53 (24) : 30319 - 30328
  • [2] AutoML for Stream k-Nearest Neighbors Classification
    Bahri, Maroua
    Veloso, Bruno
    Bifet, Albert
    Gama, Joao
    2020 IEEE INTERNATIONAL CONFERENCE ON BIG DATA (BIG DATA), 2020, : 597 - 602
  • [3] A new approach for increasing K-nearest neighbors performance
    Aamer, Youssef
    Benkaouz, Yahya
    Ouzzif, Mohammed
    Bouragba, Khalid
    2020 8TH INTERNATIONAL CONFERENCE ON WIRELESS NETWORKS AND MOBILE COMMUNICATIONS (WINCOM 2020), 2020, : 35 - 39
  • [4] Conformal transformation of the metric for k-nearest neighbors classification
    Popescu, Marius Claudiu
    Grama, Lacrimioara
    Rusu, Corneliu
    2020 IEEE 16TH INTERNATIONAL CONFERENCE ON INTELLIGENT COMPUTER COMMUNICATION AND PROCESSING (ICCP 2020), 2020, : 229 - 234
  • [5] Efficient k-nearest neighbors search in graph space
    Abu-Aisheh, Zeina
    Raveaux, Romain
    Ramel, Jean-Yves
    PATTERN RECOGNITION LETTERS, 2020, 134 (134) : 77 - 86
  • [6] Human Sleep Scoring Based on K-Nearest Neighbors
    Qureshi, Shahnawaz
    Karrila, Seppo
    Vanichayobon, Sirirut
    TURKISH JOURNAL OF ELECTRICAL ENGINEERING AND COMPUTER SCIENCES, 2018, 26 (06) : 2802 - +
  • [7] Robustness verification of k-nearest neighbors by abstract interpretation
    Fassina, Nicolo
    Ranzato, Francesco
    Zanella, Marco
    KNOWLEDGE AND INFORMATION SYSTEMS, 2024, 66 (08) : 4825 - 4859
  • [8] A K-nearest neighbors survival probability prediction method
    Lowsky, D. J.
    Ding, Y.
    Lee, D. K. K.
    McCulloch, C. E.
    Ross, L. F.
    Thistlethwaite, J. R.
    Zenios, S. A.
    STATISTICS IN MEDICINE, 2013, 32 (12) : 2062 - 2069
  • [9] k-Nearest Neighbors for automated classification of celestial objects
    Li LiLi
    Zhang YanXia
    Zhao YongHeng
    SCIENCE IN CHINA SERIES G-PHYSICS MECHANICS & ASTRONOMY, 2008, 51 (07) : 916 - 922
  • [10] k-nearest neighbors prediction and classification for spatial data
    Mohamed-Salem Ahmed
    Mamadou N’diaye
    Mohammed Kadi Attouch
    Sophie Dabo-Niange
    Journal of Spatial Econometrics, 2023, 4 (1):