Hypersphere anchor loss for K-Nearest neighbors

Cited: 0
Authors
Xiang Ye
Zihang He
Heng Wang
Yong Li
Affiliations
[1] Beijing University of Posts and Telecommunications, School of Electronic Engineering
Source
Applied Intelligence | 2023 / Volume 53
Keywords
K-Nearest neighbors; Convolutional neural network; Image classification; Loss function
DOI
Not available
Abstract
Learning effective feature spaces for KNN (K-Nearest Neighbor) classifiers is critical to their performance. Existing KNN loss functions, designed to optimize CNNs in $\mathbb{R}^n$ feature spaces for specific KNN classifiers, greatly boost performance. However, these loss functions must compute the pairwise distances within each batch, which demands substantial computational resources (GPU time and memory). This paper exploits lightweight KNN loss functions that reduce the computational cost while achieving performance comparable to or better than existing KNN loss functions. To this end, an anchor loss function is proposed that assigns each category an anchor vector in the KNN feature space and introduces the distances between training samples and anchor vectors into the NCA (Neighborhood Component Analysis) objective. The proposed anchor loss function greatly reduces the computation required by existing KNN loss functions. In addition, instead of optimizing CNNs in $\mathbb{R}^n$ feature spaces, this paper proposes to optimize them in hypersphere feature spaces for faster convergence and better performance. The proposed anchor loss optimized in the hypersphere feature space is called HAL (Hypersphere Anchor Loss). Experiments on various image classification benchmarks show that HAL reduces the computational cost and achieves better performance: on the CIFAR-10 and Fashion-MNIST datasets, compared with existing KNN loss functions, HAL improves accuracy by over $1\%$ while the computational cost decreases to less than $10\%$.
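The abstract describes the method only in words; the sketch below is one plausible PyTorch reading of it, not the authors' implementation. The class name HypersphereAnchorLoss, the random anchor initialization, and the temperature scale on the distances are assumptions; only the overall structure (learnable per-class anchors, L2-normalized hypersphere features, and an NCA-style softmax over sample-to-anchor distances in place of batchwise pairwise distances) follows the abstract.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class HypersphereAnchorLoss(nn.Module):
    """Minimal sketch of a hypersphere anchor loss (see assumptions above).

    Each class c gets a learnable anchor vector a_c. Features and anchors
    are L2-normalized onto the unit hypersphere, and an NCA-style softmax
    over negative squared sample-to-anchor distances replaces the pairwise
    distance computation within each batch.
    """

    def __init__(self, num_classes: int, feat_dim: int, scale: float = 10.0):
        super().__init__()
        # One anchor per class; random initialization is an assumption.
        self.anchors = nn.Parameter(torch.randn(num_classes, feat_dim))
        # Temperature on the distances is an assumption, not from the paper.
        self.scale = scale

    def forward(self, features: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
        # Project features and anchors onto the unit hypersphere.
        z = F.normalize(features, dim=1)        # (B, D)
        a = F.normalize(self.anchors, dim=1)    # (C, D)
        # Squared Euclidean distances to all class anchors: only B*C
        # distances, versus B*B for pairwise batchwise KNN losses.
        d2 = torch.cdist(z, a).pow(2)           # (B, C)
        # NCA-style assignment: softmax over negative distances, negative
        # log-likelihood of each sample's own class anchor.
        logits = -self.scale * d2
        return F.cross_entropy(logits, labels)

# Usage sketch: attach after a CNN backbone that outputs feature vectors.
criterion = HypersphereAnchorLoss(num_classes=10, feat_dim=64)
feats = torch.randn(32, 64)              # e.g., backbone(images)
labels = torch.randint(0, 10, (32,))
loss = criterion(feats, labels)
loss.backward()
```

Because each sample is compared only with the C class anchors rather than the other B-1 samples in the batch, the distance computation shrinks from O(B^2) to O(BC), consistent with the reported cost reduction; at test time a KNN classifier would then run in the learned normalized feature space.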
Pages: 30319-30328
Number of pages: 9