The Nystrom minimum kernel risk-sensitive loss algorithm with k-means sampling

Cited by: 5
Authors
Zhang, Tao [1,2]
Wang, Shiyuan [1,2]
Huang, Xuewei [1,2]
Wang, Lin [1,2]
Affiliations
[1] Southwest Univ, Coll Elect & Informat Engn, Chongqing 400715, Peoples R China
[2] Chongqing Key Lab Nonlinear Circuits & Intelligen, Chongqing 400715, Peoples R China
Source
JOURNAL OF THE FRANKLIN INSTITUTE-ENGINEERING AND APPLIED MATHEMATICS | 2020, Vol. 357, Iss. 14
Funding
National Natural Science Foundation of China;
Keywords
CORRENTROPY; FILTERS;
DOI
10.1016/j.jfranklin.2020.07.050
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
The minimum kernel risk-sensitive loss (MKRSL) algorithm was developed to improve the filtering accuracy and robustness of kernel least mean square (KLMS) in non-Gaussian noise. However, the linear growth of the network size in MKRSL imposes a heavy burden in computation time and memory. To curb this growth, this paper proposes a novel Nystrom minimum kernel risk-sensitive loss with k-means sampling (NysMKRSL-KM) algorithm, which uses the Nystrom method combined with k-means sampling to approximate the kernel matrix of MKRSL. The proposed NysMKRSL-KM algorithm achieves performance comparable to kernel adaptive filters (KAFs) at low time and storage complexity. In addition, the energy conservation relation of NysMKRSL-KM is obtained for theoretical analysis, and a sufficient condition is derived that guarantees mean square convergence. The steady-state excess mean square errors (EMSEs) of NysMKRSL-KM under different noise distributions are then derived to evaluate accuracy theoretically. Monte Carlo simulations validate the theoretical analysis and the advantages of the proposed NysMKRSL-KM algorithm. (C) 2020 The Franklin Institute. Published by Elsevier Ltd. All rights reserved.
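As a rough illustrative sketch (not the authors' implementation), the core of the Nystrom method with k-means sampling described above can be written as follows: the k-means centers of the input data serve as landmark points, from which the cross-kernel matrix C (n x m) and the landmark kernel matrix W (m x m) are formed, giving the rank-m approximation K ≈ C W⁺ Cᵀ of the full n x n kernel matrix. The kernel width `sigma`, the landmark count `m`, and the function names are illustrative assumptions.

```python
import numpy as np
from scipy.cluster.vq import kmeans2


def gaussian_kernel(X, Y, sigma=1.0):
    # Pairwise Gaussian kernel matrix between the rows of X and Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))


def nystrom_kmeans(X, m=30, sigma=1.0, seed=0):
    # Landmarks = k-means centers of the input data (k-means sampling).
    centers, _ = kmeans2(X, m, minit='++', seed=seed)
    C = gaussian_kernel(X, centers, sigma)        # n x m cross-kernel
    W = gaussian_kernel(centers, centers, sigma)  # m x m landmark kernel
    # Rank-m Nystrom approximation: K ~= C W^+ C^T
    return C @ np.linalg.pinv(W) @ C.T


# Usage: compare the approximation against the full kernel matrix.
X = np.random.default_rng(0).normal(size=(200, 2))
K_full = gaussian_kernel(X, X, sigma=1.0)
K_hat = nystrom_kmeans(X, m=30, sigma=1.0)
rel_err = np.linalg.norm(K_full - K_hat) / np.linalg.norm(K_full)
```

Because only C and W are stored and updated, an adaptive filter built on this approximation works in a fixed m-dimensional feature space, which is what curbs the linear network growth of MKRSL.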
Pages: 10082-10099
Page count: 18