Distance Metric Learning Using Dropout: A Structured Regularization Approach

Cited by: 2
Authors
Qian, Qi [1 ]
Hu, Juhua [2 ]
Jin, Rong [1 ]
Pei, Jian [2 ]
Zhu, Shenghuo [3 ]
Affiliations
[1] Michigan State Univ, Dept Comp Sci & Engn, E Lansing, MI 48824 USA
[2] Simon Fraser Univ, Sch Comp Sci, Burnaby, BC V5A 1S6, Canada
[3] NEC Labs, Cupertino, CA 95014 USA
Source
PROCEEDINGS OF THE 20TH ACM SIGKDD INTERNATIONAL CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING (KDD'14) | 2014
Funding
Natural Sciences and Engineering Research Council of Canada;
Keywords
Distance Metric Learning; Dropout; Noise;
DOI
10.1145/2623330.2623678
CLC (Chinese Library Classification) number
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Distance metric learning (DML) aims to learn a distance metric that outperforms the Euclidean distance, and has been successfully applied to tasks such as classification, clustering, and information retrieval. Many DML algorithms suffer from over-fitting because of the large number of parameters to be determined. In this paper, we exploit the dropout technique, which has been successfully applied in deep learning to alleviate over-fitting, for DML. Unlike previous studies that apply dropout only to the training data, we apply dropout to both the learned metric and the training data. We show that applying dropout to DML is essentially equivalent to matrix-norm-based regularization. Compared with the standard regularization scheme in DML, dropout has the advantage of simulating structured regularizers, which have consistently shown better performance than unstructured regularizers. We verify, both empirically and theoretically, that dropout is effective in regularizing the learned metric to avoid over-fitting. Finally, we examine the idea of wrapping the dropout technique into state-of-the-art DML methods and observe that it can significantly improve the performance of the original DML methods.
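The abstract's claim that dropout amounts to a structured regularizer can be illustrated numerically. The sketch below is my own illustration, not code from the paper: it applies Bernoulli feature dropout to a Mahalanobis distance d_M(x, y) = (x - y)^T M (x - y) and checks by Monte Carlo that the expected dropped-out distance equals a scaled copy of the original distance plus a data-weighted penalty on the diagonal of M. The keep probability `p`, the random metric `M`, and the difference vector `z` are all illustrative assumptions.

```python
import numpy as np

# Illustrative sketch (not the paper's algorithm): with i.i.d. Bernoulli(p)
# keep-masks delta applied to the difference vector z = x - y, the expected
# Mahalanobis distance under dropout satisfies
#   E[(delta*z)^T M (delta*z)] = p^2 z^T M z + p(1-p) sum_i M_ii z_i^2,
# i.e. dropout implicitly adds a structured (diagonal) penalty on M.
rng = np.random.default_rng(0)
d, p, n = 5, 0.8, 200_000            # dimension, keep prob, MC samples (assumed)

A = rng.normal(size=(d, d))
M = A @ A.T                          # a random PSD metric matrix (assumption)
z = rng.normal(size=d)               # a difference vector x - y (assumption)

masks = (rng.random((n, d)) < p).astype(float)   # one keep-mask per sample
zd = masks * z                                   # dropped-out difference vectors
mc = np.einsum("ni,ij,nj->", zd, M, zd) / n      # Monte Carlo expectation

closed_form = p**2 * (z @ M @ z) + p * (1 - p) * np.sum(np.diag(M) * z**2)
print(mc, closed_form)               # the two values agree up to sampling noise
```

The closed form follows from E[delta_i delta_j] = p^2 for i != j and p for i = j; the extra p(1-p)-weighted term is the data-dependent diagonal regularizer that the abstract refers to.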
Pages: 323-332
Page count: 10