Distance Metric Learning for Large Margin Nearest Neighbor Classification

Cited by: 0
Authors
Weinberger, Kilian Q. [1 ]
Saul, Lawrence K. [2 ]
Affiliations
[1] Yahoo Res, Santa Clara, CA USA
[2] Univ Calif San Diego, Dept Comp Sci & Engn, La Jolla, CA 92093 USA
Funding
National Science Foundation (USA);
Keywords
convex optimization; semi-definite programming; Mahalanobis distance; metric learning; multi-class classification; support vector machines;
DOI
Not available
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
The accuracy of k-nearest neighbor (kNN) classification depends significantly on the metric used to compute distances between different examples. In this paper, we show how to learn a Mahalanobis distance metric for kNN classification from labeled examples. The Mahalanobis metric can equivalently be viewed as a global linear transformation of the input space that precedes kNN classification using Euclidean distances. In our approach, the metric is trained with the goal that the k-nearest neighbors always belong to the same class while examples from different classes are separated by a large margin. As in support vector machines (SVMs), the margin criterion leads to a convex optimization based on the hinge loss. Unlike learning in SVMs, however, our approach requires no modification or extension for problems in multiway (as opposed to binary) classification. In our framework, the Mahalanobis distance metric is obtained as the solution to a semidefinite program. On several data sets of varying size and difficulty, we find that metrics trained in this way lead to significant improvements in kNN classification. Sometimes these results can be further improved by clustering the training examples and learning an individual metric within each cluster. We show how to learn and combine these local metrics in a globally integrated manner.
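To make the margin criterion concrete, the following is a minimal NumPy sketch of a pull/push objective of the kind described above: distances to target neighbors (same-class nearest neighbors) are penalized directly, while differently labeled examples that come within a unit margin of a target neighbor incur a hinge penalty. The helper names (`target_neighbors`, `lmnn_objective`), the weighting constant `c`, and the synthetic two-class data are illustrative assumptions, not the authors' code; the sketch only evaluates the objective for fixed candidate metrics, whereas the paper optimizes the metric itself.

```python
import numpy as np


def mahalanobis_sq(M, u, v):
    """Squared Mahalanobis distance d_M(u, v) = (u - v)^T M (u - v)."""
    d = u - v
    return float(d @ M @ d)


def target_neighbors(X, y, k):
    """For each point, the indices of its k nearest same-class neighbors
    under plain Euclidean distance (the 'target neighbors')."""
    targets = []
    for i in range(len(X)):
        same = np.flatnonzero(y == y[i])
        same = same[same != i]
        order = np.argsort(np.linalg.norm(X[same] - X[i], axis=1))
        targets.append(same[order[:k]])
    return targets


def lmnn_objective(M, X, y, targets, c=1.0):
    """Pull + hinge-push objective of the kind described in the abstract:
    target neighbors are pulled close, while differently labeled points
    ('impostors') must be farther away by a margin of 1 or incur hinge loss."""
    pull, push = 0.0, 0.0
    for i, xi in enumerate(X):
        impostors = np.flatnonzero(y != y[i])
        for j in targets[i]:
            d_ij = mahalanobis_sq(M, xi, X[j])
            pull += d_ij
            for l in impostors:
                push += max(0.0, 1.0 + d_ij - mahalanobis_sq(M, xi, X[l]))
    return pull + c * push


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy data: coordinate 0 carries the class signal, coordinate 1 is noise.
    X = np.vstack([rng.normal([0.0, 0.0], [0.3, 2.0], size=(20, 2)),
                   rng.normal([1.0, 0.0], [0.3, 2.0], size=(20, 2))])
    y = np.array([0] * 20 + [1] * 20)
    targets = target_neighbors(X, y, k=3)

    # Compare the objective under the Euclidean metric and under a hand-picked
    # metric that emphasizes the discriminative coordinate. The learning
    # algorithm instead optimizes M directly, subject to M being positive
    # semidefinite, which makes the problem a semidefinite program.
    print("Euclidean metric  :", lmnn_objective(np.eye(2), X, y, targets))
    print("Re-weighted metric:", lmnn_objective(np.diag([4.0, 0.25]), X, y, targets))
```

Minimizing such an objective over the positive semidefinite matrix M, for example with an off-the-shelf SDP solver or a projected subgradient method, yields the learned Mahalanobis metric; kNN classification then uses the learned distance, or equivalently Euclidean distance after the corresponding linear transformation of the inputs.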
Pages: 207 - 244
Number of pages: 38