Learning a Mahalanobis metric from equivalence constraints

Cited: 0
Authors
Bar-Hillel, AB [1]
Hertz, T
Shental, N
Weinshall, D
Affiliations
[1] Hebrew Univ Jerusalem, Sch Engn & Comp Sci, IL-91904 Jerusalem, Israel
[2] Hebrew Univ Jerusalem, Ctr Neural Computat, IL-91904 Jerusalem, Israel
Keywords
clustering; metric learning; dimensionality reduction; equivalence constraints; side information;
DOI
Not available
Chinese Library Classification (CLC)
TP [Automation technology; computer technology];
Discipline classification code
0812;
Abstract
Many learning algorithms use a metric defined over the input space as a principal tool, and their performance critically depends on the quality of this metric. We address the problem of learning metrics using side-information in the form of equivalence constraints. We demonstrate that, unlike labels, this type of side-information can sometimes be obtained automatically, without the need for human intervention. We show how such side-information can be used to modify the representation of the data, leading to improved clustering and classification. Specifically, we present the Relevant Component Analysis (RCA) algorithm, which is a simple and efficient algorithm for learning a Mahalanobis metric. We show that RCA is the solution of an interesting optimization problem, founded on an information theoretic basis. If dimensionality reduction is allowed within RCA, we show that it is optimally accomplished by a version of Fisher's linear discriminant that uses constraints. Moreover, under certain Gaussian assumptions, RCA can be viewed as a Maximum Likelihood estimation of the within class covariance matrix. We conclude with extensive empirical evaluations of RCA, showing its advantage over alternative methods.
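The core computation the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' reference implementation: it assumes the positive equivalence constraints have already been grouped into "chunklets" (small sets of points known to share a class), estimates the average within-chunklet covariance, and applies its inverse square root as a whitening transform, which is equivalent to using the inverse covariance as a Mahalanobis matrix. The function name, the eigendecomposition route, and the small regularization floor are illustrative choices.

```python
import numpy as np

def rca_whitening(X, chunklets):
    """Sketch of RCA-style whitening from equivalence constraints.

    X         : (n_samples, n_features) data matrix
    chunklets : list of index arrays; points within one chunklet are
                known to belong to the same (unknown) class
    Returns W such that x -> W @ x whitens the within-chunklet scatter,
    i.e. the learned Mahalanobis matrix is inv(C).
    """
    d = X.shape[1]
    C = np.zeros((d, d))
    n = 0
    for idx in chunklets:
        Xc = X[idx] - X[idx].mean(axis=0)   # center each chunklet on its own mean
        C += Xc.T @ Xc                      # accumulate within-chunklet scatter
        n += len(idx)
    C /= n                                  # average within-chunklet covariance

    # Whitening transform W = C^{-1/2} via symmetric eigendecomposition,
    # with a small floor on the eigenvalues for numerical stability.
    vals, vecs = np.linalg.eigh(C)
    W = vecs @ np.diag(1.0 / np.sqrt(np.maximum(vals, 1e-12))) @ vecs.T
    return W

# Hypothetical usage: two chunklets obtained from side-information
X = np.random.randn(10, 3)
chunklets = [np.array([0, 1, 2]), np.array([5, 6, 7, 8])]
W = rca_whitening(X, chunklets)
X_transformed = X @ W.T                     # data in the RCA-transformed space
```

In the transformed space, plain Euclidean distance corresponds to the Mahalanobis distance induced by the estimated within-class covariance, which is why the same transform can feed either a clustering or a nearest-neighbor classifier.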
Pages: 937-965
Number of pages: 29
Related papers
50 records in total
  • [21] Semi-supervised metric learning in stratified spaces via integrating local constraints and information-theoretic non-local constraints
    Karimi, Zohre
    Ghidary, Saeed Shiry
    NEUROCOMPUTING, 2018, 312 : 165 - 176
  • [22] Metric Learning from Imbalanced Data
    Gautheron, Leo
    Habrard, Amaury
    Morvant, Emilie
    Sebban, Marc
    2019 IEEE 31ST INTERNATIONAL CONFERENCE ON TOOLS WITH ARTIFICIAL INTELLIGENCE (ICTAI 2019), 2019, : 923 - 930
  • [23] Metric Learning from Probabilistic Labels
    Huai, Mengdi
    Miao, Chenglin
    Li, Yaliang
    Suo, Qiuling
    Su, Lu
    Zhang, Aidong
    KDD'18: PROCEEDINGS OF THE 24TH ACM SIGKDD INTERNATIONAL CONFERENCE ON KNOWLEDGE DISCOVERY & DATA MINING, 2018, : 1541 - 1550
  • [24] LogDet Divergence-Based Metric Learning With Triplet Constraints and Its Applications
    Mei, Jiangyuan
    Liu, Meizhu
    Karimi, Hamid Reza
    Gao, Huijun
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2014, 23 (11) : 4920 - 4931
  • [25] DBCAMM: A novel density based clustering algorithm via using the Mahalanobis metric
    Ren, Yan
    Liu, Xiaodong
    Liu, Wanquan
    APPLIED SOFT COMPUTING, 2012, 12 (05) : 1542 - 1554
  • [26] A novel approach for ear recognition: learning Mahalanobis distance features from deep CNNs
    Omara, Ibrahim
    Hagag, Ahmed
    Ma, Guangzhi
    Abd El-Samie, Fathi E.
    Song, Enmin
    MACHINE VISION AND APPLICATIONS, 2021, 32 (01)
  • [27] Semi-supervised dimensionality reduction using pairwise equivalence constraints
    Cevikalp, Hakan
    Verbeek, Jakob
    Jurie, Frederic
    Klaser, Alexander
    VISAPP 2008: PROCEEDINGS OF THE THIRD INTERNATIONAL CONFERENCE ON COMPUTER VISION THEORY AND APPLICATIONS, VOL 1, 2008, : 489 - 496
  • [28] Metric Learning as a Service With Covariance Embedding
    Kamal, Imam Mustafa
    Bae, Hyerim
    Liu, Ling
    IEEE TRANSACTIONS ON SERVICES COMPUTING, 2023, 16 (05) : 3508 - 3522
  • [29] Transferable Deep Metric Learning for Clustering
    Chehboune, Mohamed Alami
    Kaddah, Rim
    Read, Jesse
    ADVANCES IN INTELLIGENT DATA ANALYSIS XXI, IDA 2023, 2023, 13876 : 15 - 28
  • [30] A kernel approach for semisupervised metric learning
    Yeung, Dit-Yan
    Chang, Hong
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 2007, 18 (01): 141 - 149