Online learning with kernel regularized least mean square algorithms

Cited by: 10
Authors
Fan, Haijin [1]
Song, Qing [1]
Shrestha, Sumit Bam [1]
Affiliations
[1] Nanyang Technol Univ, Sch Elect & Elect Engn, Singapore 639798, Singapore
Keywords
Kernel method; Dictionary; Cumulative coherence; Diagonally dominant; Weight convergence;
DOI
10.1016/j.knosys.2014.02.005
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
In this paper, we propose a novel kernel least mean square algorithm with regularized structural risk for online learning. To curb the continual growth of the number of kernel functions, a new dictionary selection method based on the cumulative coherence measure performs the sparsification, yielding a dictionary whose Gram matrix is diagonally dominant under certain conditions. For the kernel weight update, the linear least mean square algorithm is generalized to the reproducing kernel Hilbert space (RKHS) with a minimized structural risk at each update, resulting in the kernel regularized least mean square (KRLMS) algorithm. A simplified version of KRLMS, which uses only partial updating information at each iteration, is also presented to reduce the computational complexity. The convergence of both algorithms is analyzed theoretically, and variable learning rates are adopted in the training process to guarantee weight convergence in terms of a bounded measurement error. Several experiments demonstrate the effectiveness of the proposed algorithms for online learning in comparison with existing kernel algorithms. (C) 2014 Elsevier B.V. All rights reserved.
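The abstract combines two mechanisms: coherence-based dictionary sparsification and a regularized LMS update in the RKHS. The following Python sketch illustrates the general idea under explicit assumptions, not the authors' exact algorithm: it uses a Gaussian kernel, a plain squared-norm penalty on the expansion coefficients in place of the paper's structural-risk term, and a fixed learning rate rather than the variable rates used in the paper's convergence analysis. All names (SketchKRLMS, gauss_kernel, mu0, eta, lam) are illustrative and do not come from the paper.

import numpy as np


def gauss_kernel(x, y, sigma=1.0):
    """Gaussian (RBF) kernel between two input vectors."""
    d = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return float(np.exp(-(d @ d) / (2.0 * sigma ** 2)))


class SketchKRLMS:
    """Illustrative KRLMS-style online learner (a sketch, not the paper's method).

    Sparsification: a new input joins the dictionary only while its
    cumulative coherence against the stored centers stays below mu0;
    a small enough mu0 keeps the dictionary Gram matrix diagonally
    dominant. Update: stochastic gradient on the instantaneous
    regularized risk 0.5*e^2 + 0.5*lam*||alpha||^2, a leaky-LMS
    stand-in for the paper's structural-risk term.
    """

    def __init__(self, sigma=1.0, mu0=0.5, eta=0.2, lam=1e-3):
        self.sigma, self.mu0, self.eta, self.lam = sigma, mu0, eta, lam
        self.centers = []             # dictionary of stored inputs
        self.alpha = np.zeros(0)      # kernel expansion coefficients

    def _kvec(self, x):
        return np.array([gauss_kernel(x, c, self.sigma) for c in self.centers])

    def predict(self, x):
        return float(self.alpha @ self._kvec(x)) if self.centers else 0.0

    def step(self, x, y):
        k = self._kvec(x)
        e = y - (float(self.alpha @ k) if self.centers else 0.0)
        # Cumulative coherence of x: sum of |k(x, c_i)| over the dictionary.
        if not self.centers or np.sum(np.abs(k)) <= self.mu0:
            self.centers.append(np.asarray(x, dtype=float))
            self.alpha = np.append(self.alpha, 0.0)
            k = np.append(k, 1.0)     # k(x, x) = 1 for the Gaussian kernel
        # Regularized (leaky) LMS update of the expansion coefficients.
        self.alpha = (1.0 - self.eta * self.lam) * self.alpha + self.eta * e * k
        return e


# Usage on a toy nonlinear regression stream.
rng = np.random.default_rng(0)
model = SketchKRLMS(sigma=0.5, mu0=0.3)
for _ in range(2000):
    x = rng.uniform(-np.pi, np.pi, size=1)
    model.step(x, np.sin(x[0]) + 0.05 * rng.standard_normal())
print(len(model.centers), model.predict(np.array([1.0])))

With a Gaussian kernel, the cumulative-coherence test admits a new center only when it is sufficiently dissimilar from every stored center, which is what bounds the dictionary size and, for a small enough threshold, keeps the Gram matrix diagonally dominant. The paper's simplified variant would correspond roughly to updating only a subset of the coefficients in the last line of step, trading some accuracy for lower per-iteration cost.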
Pages: 21-32
Page count: 12
Related papers
50 items in total (first 10 listed)
  • [1] Regularized Kernel Least Mean Square Algorithm with Multiple-delay Feedback
    Wang, Shiyuan
    Zheng, Yunfei
    Ling, Chengxiu
    IEEE SIGNAL PROCESSING LETTERS, 2016, 23 (01) : 98 - 101
  • [2] Kernel Least Mean Square with Tracking
    Wang, Wanli
    Wang, Shiyuan
    Qian, Guobing
    Yang, Bo
    PROCEEDINGS OF THE 36TH CHINESE CONTROL CONFERENCE (CCC 2017), 2017, : 5100 - 5104
  • [3] Mixture Kernel Least Mean Square
    Pokharel, Rosha
    Seth, Sohan
    Principe, Jose C.
2013 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2013
  • [4] Kernel least mean square with adaptive kernel size
    Chen, Badong
    Liang, Junli
    Zheng, Nanning
    Principe, Jose C.
    NEUROCOMPUTING, 2016, 191 : 95 - 106
  • [5] Kernel Least Mean Square Algorithm With Mixed Kernel
    Sun, Qitang
    Dang, Lujuan
    Wang, Wanli
    Wang, Shiyuan
    PROCEEDINGS OF 2018 TENTH INTERNATIONAL CONFERENCE ON ADVANCED COMPUTATIONAL INTELLIGENCE (ICACI), 2018, : 140 - 144
  • [6] Mean square convergence analysis for kernel least mean square algorithm
    Chen, Badong
    Zhao, Songlin
    Zhu, Pingping
    Principe, Jose C.
    SIGNAL PROCESSING, 2012, 92 (11) : 2624 - 2632
  • [7] Online Nonlinear Granger Causality Detection by Quantized Kernel Least Mean Square
    Ji, Hong
    Chen, Badong
    Yuan, Zejian
    Zheng, Nanning
    Keil, Andreas
    Principe, Jose C.
    NEURAL INFORMATION PROCESSING (ICONIP 2014), PT II, 2014, 8835 : 68 - 75
  • [8] REGULARIZED LEAST SQUARE KERNEL REGRESSION FOR STREAMING DATA
    Zheng, Xiaoqing
    Sun, Hongwei
    Wu, Qiang
    COMMUNICATIONS IN MATHEMATICAL SCIENCES, 2021, 19 (06) : 1533 - 1548
  • [9] Quantised kernel least mean square algorithm with a learning vector strategy
    Zhang, Qiangqiang
    Wang, Shiyuan
    ELECTRONICS LETTERS, 2020, 56 (21) : 1146 - 1147
  • [10] Quantized Kernel Least Mean Square Algorithm
    Chen, Badong
    Zhao, Songlin
    Zhu, Pingping
    Principe, Jose C.
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2012, 23 (01) : 22 - 32