Hypersphere anchor loss for K-Nearest neighbors

Cited by: 0
Authors
Xiang Ye
Zihang He
Heng Wang
Yong Li
Affiliations
[1] Beijing University of Posts and Telecommunications, School of Electronic Engineering
Source
Applied Intelligence | 2023 / Vol. 53
Keywords
K-Nearest neighbors; Convolutional neural network; Image classification; Loss function
DOI
Not available
Abstract
Learning effective feature spaces for KNN (K-Nearest Neighbor) classifiers is critical to their performance. Existing KNN loss functions, designed to optimize CNNs in $\mathbb{R}^n$ feature spaces for specific KNN classifiers, greatly boost performance. However, these loss functions must compute the pairwise distances within each batch, which demands substantial computational resources (GPU time and memory). This paper aims to develop lightweight KNN loss functions that reduce this computational cost while achieving performance comparable to, or even better than, existing KNN loss functions. To this end, an anchor loss function is proposed that assigns each category an anchor vector in the KNN feature space and introduces the distances between training samples and anchor vectors into the NCA (Neighborhood Component Analysis) function. The proposed anchor loss function greatly reduces the computation required by existing KNN loss functions. In addition, instead of optimizing CNNs in $\mathbb{R}^n$ feature spaces, this paper proposes optimizing them in hypersphere feature spaces for faster convergence and better performance. The proposed anchor loss optimized in the hypersphere feature space is called HAL (Hypersphere Anchor Loss). Experiments on several image classification benchmarks show that HAL reduces the computational cost and achieves better performance: on the CIFAR-10 and Fashion-MNIST datasets, compared with existing KNN loss functions, HAL improves accuracy by over $1\%$, and the computational cost decreases to less than $10\%$ of theirs.
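To make the computational argument concrete, below is a minimal PyTorch sketch of how an anchor-based NCA loss on the unit hypersphere might look. This is an illustration, not the authors' code: the class name, the temperature parameter `scale`, and the random anchor initialization are assumptions, and the paper's exact formulation may differ. The point it demonstrates is that each batch needs only the B x C sample-to-anchor distances rather than the B x B pairwise distances within the batch.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class HypersphereAnchorLoss(nn.Module):
    """A sketch of an anchor-based NCA loss on the unit hypersphere.

    Each class owns one learnable anchor vector. Features and anchors are
    L2-normalized, so all distances are measured on the hypersphere, and an
    NCA-style softmax over the distances to the C anchors replaces the
    softmax over all pairwise distances within the batch.
    """

    def __init__(self, num_classes: int, feat_dim: int, scale: float = 10.0):
        super().__init__()
        # One learnable anchor per class (random init is an assumption here).
        self.anchors = nn.Parameter(torch.randn(num_classes, feat_dim))
        self.scale = scale  # softmax temperature; also an assumption

    def forward(self, features: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
        # Project CNN features and anchors onto the unit hypersphere.
        z = F.normalize(features, dim=1)      # (B, d)
        a = F.normalize(self.anchors, dim=1)  # (C, d)
        # On the unit sphere, ||z - a||^2 = 2 - 2 * z.a, so a single (B, C)
        # matrix product yields every sample-to-anchor squared distance.
        sq_dist = 2.0 - 2.0 * z @ a.t()
        # NCA-style assignment: softmax over negative distances; the
        # cross-entropy pulls each sample toward its own class anchor.
        return F.cross_entropy(-self.scale * sq_dist, labels)
```

At inference time, the CNN's normalized features would feed a standard KNN classifier; during training, the loss touches only B*C distances per batch instead of B*B, which is where the cost saving described in the abstract comes from.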
Pages: 30319 - 30328
Number of pages: 9
Related papers
50 items in total
  • [31] On Application of a Probabilistic K-Nearest Neighbors Model for Cluster Validation Problem
    Volkovich, Zeev
    Barzily, Zeev
    Avros, Renata
    Toledano-Kitai, Dvora
    COMMUNICATIONS IN STATISTICS-THEORY AND METHODS, 2011, 40 (16) : 2997 - 3010
  • [32] Density peaks clustering algorithm with K-nearest neighbors and weighted similarity
    Zhao J.
    Chen L.
    Wu R.-X.
    Zhang B.
    Han L.-Z.
Kongzhi Lilun Yu Yingyong/Control Theory and Applications, 2022, 39 (12) : 2349 - 2357
  • [33] Research on the Humanlike Trajectories Control of Robots Based on the K-Nearest Neighbors
    Wang Lei
    Liu Zhaowei
    2017 CHINESE AUTOMATION CONGRESS (CAC), 2017, : 7746 - 7751
  • [34] Weighted K-nearest neighbors classification based on Whale optimization algorithm
    Anvari, S.
    Azgomi, M. Abdollahi
    Dishabi, M. R. Ebrahimi
    Maheri, M.
IRANIAN JOURNAL OF FUZZY SYSTEMS, 2023, 20 (03) : 61 - 74
  • [35] Incorporating Fitness Inheritance and k-Nearest Neighbors for Evolutionary Dynamic Optimization
    Liaw, Rung-Tzuo
    Ting, Chuan-Kang
    2018 IEEE CONGRESS ON EVOLUTIONARY COMPUTATION (CEC), 2018, : 1345 - 1352
  • [36] Oversampling by genetic algorithm and k-nearest neighbors for network intrusion problem
    Jindaluang, Wattana
    JOURNAL OF INTELLIGENT & FUZZY SYSTEMS, 2022, 43 (03) : 2515 - 2528
  • [37] Applying k-nearest neighbors to time series forecasting: Two new approaches
    Tajmouati, Samya
    Wahbi, Bouazza E. L.
    Bedoui, Adel
    Abarda, Abdallah
    Dakkon, Mohamed
    JOURNAL OF FORECASTING, 2024, 43 (05) : 1559 - 1574
  • [38] Emotion recognition using speckle pattern analysis and k-nearest neighbors classification
    Lupa Yitzhak, Hadas
    Tzabari Kelman, Yarden
    Moskovenko, Alexey
    Zhovnerchuk, Evgenii
    Zalevsky, Zeev
    JOURNAL OF OPTICS, 2021, 23 (01)
  • [39] Hybrid Genetic Algorithm With k-Nearest Neighbors for Radial Distribution Network Reconfiguration
    Jo, Seungchan
    Oh, Jae-Young
    Lee, Jaeho
    Oh, Seokhwa
    Moon, Hee Seung
    Zhang, Chen
    Gadh, Rajit
    Yoon, Yong Tae
    IEEE TRANSACTIONS ON SMART GRID, 2024, 15 (03) : 2614 - 2624
  • [40] A New Version of the Dendritic Cell Immune Algorithm Based on the K-Nearest Neighbors
    Ben Ali, Kaouther
    Chelly, Zeineb
    Elouedi, Zied
    NEURAL INFORMATION PROCESSING, PT I, 2015, 9489 : 688 - 695