Quantized kernel recursive minimum error entropy algorithm

Cited by: 2
Authors
Jiang, Wang [1 ]
Gao, Yuyi [1 ]
He, Yue [1 ]
Chen, Shanmou [1 ]
Affiliations
[1] Southwest Univ, Coll Elect & Informat Engn, Chongqing, Peoples R China
Keywords
Online prediction; Quantized kernel recursive minimum error entropy; Kernel recursive minimum error entropy; CONVERGENCE; NETWORK;
DOI
10.1016/j.engappai.2023.105957
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
In this paper, we propose an online vector quantization (VQ) method based on the kernel recursive minimum error entropy (KRMEE) algorithm. According to information theoretic learning (ITL), the minimum error entropy (MEE) criterion is robust and can effectively resist non-Gaussian noise. Combining the kernel recursive least squares (KRLS) algorithm with the MEE criterion yields the KRMEE algorithm, which performs well in non-Gaussian environments. However, as the amount of data increases, so does the computational complexity. To solve this problem, we introduce quantization: the input space of the algorithm is quantized to suppress the linear growth of the radial basis function (RBF) network in kernel adaptive filtering (KAF). The online VQ method differs from the novelty criterion (NC), the approximate linear dependency (ALD) criterion, and other sparsification methods in that it constructs a dictionary online and measures distances with the Euclidean norm. We propose a novel quantized kernel recursive minimum error entropy (QKRMEE) algorithm by combining the VQ method with the KRMEE algorithm, and update the solution with a recursive algorithm. Monte Carlo simulation experiments on the Mackey-Glass time series and a real-world dataset show that the proposed algorithm achieves better predictive performance in non-Gaussian noise environments. Meanwhile, the algorithm restrains the growth of the RBF network well, thus reducing computational complexity and memory consumption effectively.
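The online VQ step described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: each input is mapped to its nearest dictionary codeword (Euclidean norm) if the distance falls within a quantization threshold `eps`, and is otherwise added to the dictionary as a new codeword; the function name and threshold are illustrative.

```python
import numpy as np

def quantize(dictionary, x, eps):
    """Online vector quantization (hypothetical sketch).

    Maps input x to the nearest codeword in `dictionary` if the
    Euclidean distance is at most eps; otherwise appends x as a new
    codeword. Returns (codeword index, updated dictionary).
    """
    x = np.asarray(x, dtype=float)
    if not dictionary:
        # Empty dictionary: the first input becomes the first codeword.
        dictionary.append(x)
        return 0, dictionary
    # Distance to every existing codeword (Euclidean norm).
    dists = [np.linalg.norm(x - c) for c in dictionary]
    j = int(np.argmin(dists))
    if dists[j] <= eps:
        # Quantize: reuse the nearest codeword, dictionary size unchanged.
        return j, dictionary
    # Too far from all codewords: grow the dictionary by one.
    dictionary.append(x)
    return len(dictionary) - 1, dictionary
```

Because inputs close to an existing codeword do not enlarge the dictionary, the RBF network size grows sublinearly with the data stream, which is the source of the computational and memory savings the abstract claims.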
Pages: 8