Mixture correntropy for robust learning

Cited by: 144
Authors
Chen, Badong [1 ]
Wang, Xin [1 ]
Lu, Na [1 ]
Wang, Shiyuan [2 ]
Cao, Jiuwen [3 ]
Qin, Jing [4 ]
Affiliations
[1] Xi An Jiao Tong Univ, Sch Elect & Informat Engn, Xian 710049, Shaanxi, Peoples R China
[2] Southwest Univ, Coll Elect & Informat Engn, Chongqing 400715, Peoples R China
[3] Hangzhou Dianzi Univ, Inst Informat & Control, Hangzhou 310018, Zhejiang, Peoples R China
[4] Hong Kong Polytech Univ, Sch Nursing, Ctr Smart Hlth, Hong Kong, Hong Kong, Peoples R China
Funding
National Key Research and Development Program of China;
Keywords
Correntropy; Mixture correntropy; Robust learning; Extreme learning machine; Kernel adaptive filtering; MACHINE; REGRESSION;
DOI
10.1016/j.patcog.2018.02.010
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Correntropy is a local similarity measure defined in kernel space and can therefore combat large outliers in robust signal processing and machine learning. Many robust learning algorithms have been developed under the maximum correntropy criterion (MCC), most of which use a Gaussian kernel in the correntropy. To further improve learning performance, this paper proposes the concept of mixture correntropy, which uses a mixture of two Gaussian functions as the kernel function. Some important properties of mixture correntropy are presented. Applications of the maximum mixture correntropy criterion (MMCC) to the extreme learning machine (ELM) and kernel adaptive filtering (KAF) for function approximation and data regression are also studied. Experimental results show that the learning algorithms under MMCC perform very well and achieve better performance than conventional MCC-based algorithms as well as several other state-of-the-art algorithms. (C) 2018 Elsevier Ltd. All rights reserved.
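As a rough illustration of the idea described in the abstract, the Python sketch below computes a mixture correntropy induced loss, assuming the common formulation in which the kernel is a convex combination of two Gaussian functions with bandwidths sigma1 and sigma2 and a mixing coefficient alpha; the function names, parameter defaults, and the shifted loss form are illustrative assumptions, not the paper's exact definitions.

import numpy as np

def mixture_correntropy(errors, sigma1=1.0, sigma2=4.0, alpha=0.5):
    """Empirical mixture correntropy of an error vector.

    Assumed form: mean of alpha * G_sigma1(e) + (1 - alpha) * G_sigma2(e),
    where G_sigma(e) = exp(-e^2 / (2 * sigma^2)) is the Gaussian kernel.
    Parameter names and defaults are illustrative, not taken from the paper.
    """
    e = np.asarray(errors, dtype=float)
    g1 = np.exp(-e**2 / (2.0 * sigma1**2))
    g2 = np.exp(-e**2 / (2.0 * sigma2**2))
    return np.mean(alpha * g1 + (1.0 - alpha) * g2)

def mmcc_loss(y_true, y_pred, **kwargs):
    """Loss to minimize under an MMCC-style criterion: maximizing the mixture
    correntropy of the errors equals minimizing this shifted negative."""
    errors = np.asarray(y_true, dtype=float) - np.asarray(y_pred, dtype=float)
    return 1.0 - mixture_correntropy(errors, **kwargs)

# Quick robustness check: a few gross outliers barely change the loss,
# whereas they would dominate a mean-squared-error objective.
rng = np.random.default_rng(0)
residuals = rng.normal(0.0, 0.1, size=100)
residuals[:5] += 50.0  # inject 5% gross outliers
print(mmcc_loss(residuals, np.zeros_like(residuals)))  # ~0.05, near the outlier fraction

Intuitively, the smaller bandwidth down-weights gross outliers aggressively while the larger one preserves sensitivity to moderate errors; the mixing coefficient trades off between the two, which is the apparent motivation for mixing two Gaussian kernels instead of using a single one.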
Pages: 318-327
Number of pages: 10