Metric Learning via Penalized Optimization

Cited by: 2
Authors
Huang, Hao [1 ]
Peng, Yanan [1 ]
Gan, Ting [1 ]
Tu, Weiping [1 ]
Zhou, Ruiting [2 ]
Wu, Sai [3 ]
Affiliations
[1] Wuhan Univ, Sch Comp Sci, Wuhan, Peoples R China
[2] Wuhan Univ, Sch Cyber Sci & Engn, Wuhan, Peoples R China
[3] Zhejiang Univ, Coll Comp Sci & Technol, Hangzhou, Peoples R China
Source
KDD '21: PROCEEDINGS OF THE 27TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY & DATA MINING | 2021
Keywords
Metric Learning; Penalized Optimization; Classification;
D O I
10.1145/3447548.3467369
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Metric learning aims to project original data into a new space where data points can be classified more accurately by kNN or similar classification algorithms. To avoid trivial learning results, such as indistinguishably projecting the data onto a line, many existing approaches formulate metric learning as a constrained optimization problem, e.g., finding a metric that minimizes the distance between data points from the same class under the constraint that data points from different classes remain separated by a certain margin, and then approximate the optimal solution iteratively. To improve classification accuracy as much as possible, we seek a metric that simultaneously minimizes the intra-class distance and maximizes the inter-class distance. To this end, we formulate metric learning as a penalized optimization problem, and provide a design guideline, paradigms with a general formula, and two representative instantiations of the penalty term. In addition, we derive an analytical solution to the penalized optimization, which avoids costly iterative computation and, more importantly, removes any concern about convergence rates or approximation ratios. Extensive experiments on real-world data sets verify the effectiveness and efficiency of our approach.
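The abstract's idea of trading off intra-class against inter-class distance via a penalty term, solved in closed form, can be illustrated with a minimal sketch. This is an assumed instantiation for illustration only, not the paper's exact formulation: it uses a linear projection `W` (inducing the metric `M = W Wᵀ`), classical within-/between-class scatter matrices, a penalty weight `lam`, and an eigendecomposition as the analytical solution.

```python
import numpy as np

def penalized_metric(X, y, dim, lam=1.0):
    """Illustrative penalized metric learning (assumed formulation).

    Minimizes tr(W^T (S_w - lam * S_b) W) over orthonormal W, i.e.,
    intra-class scatter penalized by inter-class scatter. The closed-form
    solution is given by the eigenvectors of S_w - lam * S_b with the
    smallest eigenvalues -- no iterative optimization is needed.
    """
    mu = X.mean(axis=0)
    d = X.shape[1]
    S_w = np.zeros((d, d))  # within-class (intra-class) scatter
    S_b = np.zeros((d, d))  # between-class (inter-class) scatter
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        S_w += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mu).reshape(-1, 1)
        S_b += len(Xc) * (diff @ diff.T)
    # Symmetric matrix: eigh returns eigenvalues in ascending order.
    vals, vecs = np.linalg.eigh(S_w - lam * S_b)
    W = vecs[:, :dim]  # projection; learned metric is M = W @ W.T
    return W

# Toy usage: two well-separated Gaussian classes in 5 dimensions.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (50, 5)),
               rng.normal(3.0, 1.0, (50, 5))])
y = np.array([0] * 50 + [1] * 50)
W = penalized_metric(X, y, dim=2)
Z = X @ W  # projected data, ready for kNN classification
```

Because the objective reduces to an eigenproblem of a fixed symmetric matrix, the solution is exact rather than approximate, matching the abstract's point that convergence rates and approximation ratios become irrelevant.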
Pages: 656-664
Page count: 9
Related Papers (32 in total)
  • [1] Bar-Hillel A., 2003, ICML, P11
  • [2] Bellet A., 2013, SURVEY METRIC LEARNI
  • [3] Deep Metric Learning to Rank
    Cakir, Fatih
    He, Kun
    Xia, Xide
    Kulis, Brian
    Sclaroff, Stan
    [J]. 2019 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2019), 2019, : 1861 - 1870
  • [4] INDEPENDENT COMPONENT ANALYSIS, A NEW CONCEPT
    COMON, P
    [J]. SIGNAL PROCESSING, 1994, 36 (03) : 287 - 314
  • [5] Dhillon IS, 2007, IEEE T PATTERN ANAL, V29, P1944, DOI 10.1109/TPAMI.2007.1115
  • [6] Dua D, 2017, UCI MACHINE LEARNING, DOI 10.1016/J.DSS.2009.05.016
  • [7] Goldberger J., 2004, Adv. Neural Inf. Process. Syst., V17, DOI 10.5555/2976040.2976105
  • [8] Locality-Based Discriminant Neighborhood Embedding
    Gou, Jianping
    Yi, Zhang
    [J]. COMPUTER JOURNAL, 2013, 56 (09) : 1063 - 1082
  • [9] TagProp: Discriminative Metric Learning in Nearest Neighbor Models for Image Auto-Annotation
    Guillaumin, Matthieu
    Mensink, Thomas
    Verbeek, Jakob
    Schmid, Cordelia
    [J]. 2009 IEEE 12TH INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV), 2009, : 309 - 316
  • [10] He XF, 2005, IEEE I CONF COMP VIS, P1208