Hypersphere anchor loss for K-Nearest neighbors

Cited by: 0
Authors
Xiang Ye
Zihang He
Heng Wang
Yong Li
Affiliations
[1] Beijing University of Posts and Telecommunications, School of Electronic Engineering
Source
Applied Intelligence | 2023, Vol. 53
Keywords
K-Nearest neighbors; Convolutional neural network; Image classification; Loss function;
DOI
Not available
Abstract
Learning effective feature spaces for KNN (K-Nearest Neighbor) classifiers is critical to their performance. Existing KNN loss functions, designed to optimize CNNs in $\mathbb{R}^n$ feature spaces for specific KNN classifiers, greatly boost performance. However, these loss functions must compute the pairwise distances within each batch, which requires large computational resources (GPU time and memory). This paper aims to develop lightweight KNN loss functions that reduce the computational cost while achieving performance comparable to or even better than existing KNN loss functions. To this end, an anchor loss function is proposed that assigns each category an anchor vector in the KNN feature space and introduces the distances between training samples and anchor vectors into the NCA (Neighborhood Component Analysis) objective. The proposed anchor loss function largely reduces the computation required by existing KNN loss functions. In addition, instead of optimizing CNNs in $\mathbb{R}^n$ feature spaces, this paper proposes to optimize them in hypersphere feature spaces for faster convergence and better performance. The proposed anchor loss optimized in the hypersphere feature space is called HAL (Hypersphere Anchor Loss).
Experiments on various image classification benchmarks show that HAL reduces the computational cost and achieves better performance: on the CIFAR-10 and Fashion-MNIST datasets, compared with existing KNN loss functions, HAL improves accuracy by over $1\%$, while the computational cost decreases to less than $10\%$.
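The abstract describes the key idea: replacing the batch-wise pairwise distances of existing KNN losses with distances to one learnable anchor vector per class, scored with an NCA-style softmax on the unit hypersphere. The following is a minimal illustrative sketch of that idea in NumPy; the function name, the squared-Euclidean distance, and the plain cross-entropy form are assumptions, not the paper's exact formulation.

```python
import numpy as np

def hypersphere_anchor_loss(features, labels, anchors):
    """NCA-style anchor loss on the unit hypersphere (illustrative sketch).

    features: (B, n) raw CNN embeddings for a batch of B samples
    labels:   (B,)  integer class labels
    anchors:  (C, n) one learnable anchor vector per class
    """
    # Project both embeddings and anchors onto the unit hypersphere.
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)

    # Squared Euclidean distances from each sample to each class anchor:
    # a B x C matrix, instead of the B x B pairwise matrix that
    # batch-wise KNN losses must compute.
    d2 = ((f[:, None, :] - a[None, :, :]) ** 2).sum(axis=2)

    # NCA-style soft assignment: softmax over negative distances.
    logits = -d2
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    p = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)

    # Negative log-likelihood of each sample's own class anchor.
    return -np.log(p[np.arange(len(labels)), labels] + 1e-12).mean()
```

Because the distance matrix is B x C rather than B x B, the per-batch cost scales with the number of classes instead of the batch size, which is the source of the computational savings the abstract reports.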
Pages: 30319 - 30328 (9 pages)