Distance metric learning with local multiple kernel embedding

Cited by: 2
Authors
Zhang, Qingshuo [1 ]
Tsang, Eric C. C. [1 ]
He, Qiang [2 ]
Hu, Meng [1 ]
Affiliations
[1] Macau Univ Sci & Technol, Fac Informat Technol, Taipa, Macao, Peoples R China
[2] Beijing Univ Civil Engn & Architecture, Sch Sci, Beijing 100044, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Multiple kernel learning; Metric learning; Gating function; Kernel weight;
DOI
10.1007/s13042-021-01487-2
CLC number
TP18 [Artificial intelligence theory];
Subject classification codes
081104; 0812; 0835; 1405;
Abstract
Distance metric learning aims to learn a data-dependent similarity measure and is widely employed in machine learning. Recently, metric learning algorithms that incorporate multiple kernel learning have shown promising results on classification tasks. However, the multiple kernel learning component of existing metric learning with multiple kernels uses only a linear combination of different kernel functions in which each kernel shares the same weight over the entire input space, so the potential local structure of samples located in different regions of the input space is ignored. To address this issue, we propose a distance metric learning approach with local multiple kernel embedding (DMLLMK) for small datasets. In DMLLMK, the weight of each kernel function is assigned locally, so that each kernel space admits many different weight values; this local weighting enables metric learning to capture more information from the data. DMLLMK adjusts the kernel weights with a gating function, so that the kernel weights depend locally on the input data. The metric and the parameters of the gating function are optimized simultaneously by an alternating learning procedure. DMLLMK makes metric learning applicable to small datasets by constructing constraints over the sets of similar and dissimilar pairs, so that some data are reused and produce different constraints on the model. In addition, regularization is used to keep DMLLMK conservative and to prevent overfitting on small data. Experimental comparisons with other metric learning methods on benchmark datasets show that the proposed DMLLMK is effective.
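The abstract describes a gating function that assigns sample-dependent weights to the base kernels, so that the combined kernel varies with where each sample lies in the input space. The snippet below is a minimal Python sketch of one common way such a locally weighted multiple-kernel combination can be written (a softmax gating over linear scores, as in localized multiple kernel learning); the RBF base kernels, parameter names (gammas, V, b), and gating form are illustrative assumptions, not the authors' exact DMLLMK formulation.

import numpy as np

def rbf_kernel(X1, X2, gamma):
    # Gaussian (RBF) base kernel between the rows of X1 and X2.
    d2 = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
    return np.exp(-gamma * d2)

def gating(X, V, b):
    # Softmax gating: per-sample weights over the M base kernels (rows sum to 1).
    scores = X @ V + b                              # shape (n, M)
    scores -= scores.max(axis=1, keepdims=True)     # numerical stability
    e = np.exp(scores)
    return e / e.sum(axis=1, keepdims=True)

def local_combined_kernel(X1, X2, gammas, V, b):
    # k(x_i, x_j) = sum_m g_m(x_i) * k_m(x_i, x_j) * g_m(x_j),
    # so each kernel's weight depends on the location of both samples.
    G1, G2 = gating(X1, V, b), gating(X2, V, b)
    K = np.zeros((X1.shape[0], X2.shape[0]))
    for m, gamma in enumerate(gammas):
        K += np.outer(G1[:, m], G2[:, m]) * rbf_kernel(X1, X2, gamma)
    return K

# Toy usage: three RBF base kernels with hypothetical gating parameters.
rng = np.random.default_rng(0)
X = rng.normal(size=(10, 5))
gammas = [0.1, 1.0, 10.0]
V = rng.normal(size=(5, len(gammas)))
b = np.zeros(len(gammas))
K = local_combined_kernel(X, X, gammas, V, b)
print(K.shape)  # (10, 10); kernel weights differ from sample to sample

In the method described by the abstract, the gating parameters and the metric would then be updated alternately under the pairwise similarity constraints and regularization; the sketch only illustrates how per-sample weights enter the combined kernel.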
Pages: 79-92
Number of pages: 14