Fast and strong convergence of online learning algorithms

Cited by: 0
Authors
Zheng-Chu Guo
Lei Shi
Affiliations
[1] Zhejiang University, School of Mathematical Sciences
[2] Fudan University, Shanghai Key Laboratory for Contemporary Applied Mathematics, School of Mathematical Sciences
Source
Advances in Computational Mathematics
Keywords
Learning theory; Online learning; Capacity dependent error analysis; Strong convergence in an RKHS; 68Q32; 68T05; 62J02; 62L20
DOI
Not available
Abstract
In this paper, we study an online learning algorithm without explicit regularization terms. The algorithm is essentially a stochastic gradient descent scheme in a reproducing kernel Hilbert space (RKHS), in which the polynomially decaying step sizes play the role of regularization and ensure the generalization ability of the algorithm. We develop a novel capacity dependent analysis of the performance of the last iterate, which answers an open problem in learning theory. The contribution of this paper is twofold. First, the capacity dependent analysis leads to sharp convergence rates in the standard mean square distance that improve on existing results in the literature. Second, we establish, for the first time, strong convergence of the last iterate in the RKHS norm under polynomially decaying step sizes. The analysis fully exploits the fine structure of the underlying RKHS and thus yields sharp error estimates for the online learning algorithm.
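As a purely illustrative sketch (not code from the paper), the Python snippet below implements the kind of unregularized kernel SGD iteration described in the abstract, assuming a least-squares loss and a Gaussian kernel; the function name online_kernel_sgd, the parameters eta1 and theta, and the toy data are hypothetical choices made only for the example.

```python
import numpy as np

def gaussian_kernel(x, z, sigma=1.0):
    # Gaussian (RBF) kernel; the kernel choice here is illustrative only.
    return float(np.exp(-np.sum((np.asarray(x) - np.asarray(z)) ** 2) / (2.0 * sigma ** 2)))

def online_kernel_sgd(stream, eta1=0.5, theta=0.5, kernel=gaussian_kernel):
    # Unregularized online least-squares SGD in an RKHS:
    #   f_{t+1} = f_t - eta_t * (f_t(x_t) - y_t) * K(x_t, .),  eta_t = eta1 * t**(-theta).
    # The iterate f_t is represented through its kernel-expansion coefficients.
    points, coeffs = [], []
    for t, (x_t, y_t) in enumerate(stream, start=1):
        f_xt = sum(c * kernel(x, x_t) for x, c in zip(points, coeffs))  # evaluate f_t(x_t)
        eta_t = eta1 * t ** (-theta)  # polynomially decaying step size
        points.append(x_t)
        coeffs.append(-eta_t * (f_xt - y_t))  # gradient step adds one kernel section
    return points, coeffs

# Toy usage: learn a noisy sine curve from a stream of 200 samples.
rng = np.random.default_rng(0)
xs = rng.uniform(-1.0, 1.0, size=(200, 1))
ys = np.sin(3.0 * xs[:, 0]) + 0.1 * rng.standard_normal(200)
pts, cfs = online_kernel_sgd(zip(xs, ys))
f_hat = lambda x: sum(c * gaussian_kernel(p, x) for p, c in zip(pts, cfs))
```

The only tuning knobs in this sketch are the initial step size eta1 and the decay exponent theta; as the abstract emphasizes, the decay of the step size takes over the role of an explicit regularization term.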
Pages: 2745-2770
Number of pages: 25
Related articles (50 records in total)
  • [1] Guo, Zheng-Chu; Shi, Lei. Fast and strong convergence of online learning algorithms. Advances in Computational Mathematics, 2019, 45(5-6): 2745-2770.
  • [2] Boissier, Martin; Lyu, Siwei; Ying, Yiming; Zhou, Ding-Xuan. Fast Convergence of Online Pairwise Learning Algorithms. Artificial Intelligence and Statistics, Vol. 51, 2016: 204-212.
  • [3] Lei, Yunwen; Shi, Lei; Guo, Zheng-Chu. Convergence of Unregularized Online Learning Algorithms. Journal of Machine Learning Research, 2018, 18.
  • [4] Lin, Junhong; Zhou, Ding-Xuan. Online Learning Algorithms Can Converge Comparably Fast as Batch Learning. IEEE Transactions on Neural Networks and Learning Systems, 2018, 29(6): 2367-2378.
  • [5] Ying, Yiming. Convergence analysis of online algorithms. Advances in Computational Mathematics, 2007, 27(3): 273-291.
  • [6] Devyaterikov, I. P.; Kaplinskii, A. I.; Tsypkin, Ya. Z. Convergence of learning algorithms. Automation and Remote Control, 1969, (10): 1619.
  • [7] Ikeda, Kazushi; Honda, Arata; Hanzawa, Hiroaki; Miyoshi, Seiji. Non-monotonic convergence of online learning algorithms for perceptrons with noisy teacher. Neural Networks, 2018, 102: 21-26.
  • [8] Smale, Steve; Yao, Yuan. Online Learning Algorithms. Foundations of Computational Mathematics, 2006, 6: 145-170.
  • [9] Cesa-Bianchi, Nicolo; Orabona, Francesco. Online Learning Algorithms. Annual Review of Statistics and Its Application, 2021, 8: 165-190.