Mixture correntropy-based robust distance metric learning for classification

Cited by: 3
Authors
Yuan, Chao [1 ]
Zhou, Changsheng [1 ]
Peng, Jigen [1 ]
Li, Haiyang [1 ]
Affiliations
[1] Guangzhou Univ, Sch Math & Informat Sci, Guangzhou 510006, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Metric learning; Classification; Mixture correntropy; Laplacian kernel; Noise insensitivity; REGRESSION; NONCONVEX;
DOI
10.1016/j.knosys.2024.111791
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Metric learning is a branch of machine learning that aims to learn from the given training data a valid distance metric, with which the similarity between samples can be more effectively evaluated for classification. Metric learning has attracted significant attention, and a large number of models have been proposed in the past few years. However, traditional methods adopt the hinge loss, which easily leads to noise sensitivity and instability. In this paper, to improve robustness, we develop a mixture correntropy criterion in which two Laplacian kernel functions are combined as the kernel function, and we induce a more general nonconvex robust loss function from this mixture correntropy. The properties of the induced loss function are analysed and presented. The induced loss combines the advantages of state-of-the-art robust loss functions and is more effective. With this induced loss, we establish a robust metric learning model (called MCML) and design an effective iterative algorithm to solve the challenging nonconvex optimization problem. The computational complexity and convergence of the algorithm are analysed theoretically. Furthermore, a boosting version of MCML (BMCML) is derived, in which low-rank basis learning is jointly optimized with the metric to better uncover the data structure. Finally, extensive experiments are conducted on artificial datasets, UCI benchmark datasets and image datasets. The experimental results verify the robustness and effectiveness of the proposed methods.
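The abstract describes a loss induced by a mixture correntropy built from two Laplacian kernels. A minimal sketch of that idea, assuming the common construction in the mixture-correntropy literature (a convex combination of two Laplacian kernels of the residual, with the induced loss defined as one minus that mixture); the paper's exact form, parameter names (`sigma1`, `sigma2`, `lam`), and defaults here are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def mixture_correntropy_loss(e, sigma1=1.0, sigma2=2.0, lam=0.5):
    """Loss induced by a mixture correntropy of two Laplacian kernels.

    mix(e) = lam * exp(-|e|/sigma1) + (1 - lam) * exp(-|e|/sigma2)
    loss(e) = 1 - mix(e)

    The loss is zero at e = 0, increases with |e|, and saturates below 1
    for large residuals, which is what makes correntropy-type losses
    insensitive to outliers compared with the unbounded hinge loss.
    """
    e = np.asarray(e, dtype=float)
    mix = lam * np.exp(-np.abs(e) / sigma1) + (1.0 - lam) * np.exp(-np.abs(e) / sigma2)
    return 1.0 - mix
```

Because the loss saturates, a single grossly mislabeled sample contributes at most a bounded amount to the objective, whereas a hinge-type loss would let it dominate; the two bandwidths let the mixture fit both small- and large-scale residual behaviour.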
Pages: 20
Cited References
83 records in total
[41]  
Maronna R.A., 2006, Robust statistics: Theory and methods
[42]   Kernel-Based Distance Metric Learning for Supervised k-Means Clustering [J].
Nguyen, Bac ;
De Baets, Bernard .
IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2019, 30 (10) :3084-3095
[43]   Robust penalized logistic regression with truncated loss functions [J].
Park, Seo Young ;
Liu, Yufeng .
CANADIAN JOURNAL OF STATISTICS-REVUE CANADIENNE DE STATISTIQUE, 2011, 39 (02) :300-323
[44]   Automatic Subspace Learning via Principal Coefficients Embedding [J].
Peng, Xi ;
Lu, Jiwen ;
Yi, Zhang ;
Yan, Rui .
IEEE TRANSACTIONS ON CYBERNETICS, 2017, 47 (11) :3583-3596
[45]  
Perez Zamora F., 2000, Publicacion Especial - Estacion Experimental Agroindustrial "Obispo Colombres" de Tucuman, P1
[46]   Distance Metric Learning Using Dropout: A Structured Regularization Approach [J].
Qian, Qi ;
Hu, Juhua ;
Jin, Rong ;
Pei, Jian ;
Zhu, Shenghuo .
PROCEEDINGS OF THE 20TH ACM SIGKDD INTERNATIONAL CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING (KDD'14), 2014, :323-332
[47]   A Convex Model for Support Vector Distance Metric Learning [J].
Ruan, Yibang ;
Xiao, Yanshan ;
Hao, Zhifeng ;
Liu, Bo .
IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022, 33 (08) :3533-3546
[48]  
Shen CH, 2012, J MACH LEARN RES, V13, P1007
[49]   Support vector machine classifier with truncated pinball loss [J].
Shen, Xin ;
Niu, Lingfeng ;
Qi, Zhiquan ;
Tian, Yingjie .
PATTERN RECOGNITION, 2017, 68 :199-210
[50]   Training DCNN by Combining Max-Margin, Max-Correlation Objectives, and Correntropy Loss for Multilabel Image Classification [J].
Shi, Weiwei ;
Gong, Yihong ;
Tao, Xiaoyu ;
Zheng, Nanning .
IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2018, 29 (07) :2896-2908