Unregularized Online Algorithms with Varying Gaussians

Cited by: 2
Authors
Wang, Baobin [1 ]
Hu, Ting [2 ]
Affiliations
[1] South Cent Univ Nationalities, Sch Math & Stat, Wuhan 430074, Peoples R China
[2] Wuhan Univ, Sch Math & Stat, Wuhan 430072, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Online learning; Varying Gaussian kernels; Reproducing kernel Hilbert spaces; Geometric noise condition; Learning rate;
DOI
10.1007/s00365-021-09536-3
Chinese Library Classification
O1 [Mathematics];
Discipline Code
0701; 070101;
Abstract
Gaussians are a family of Mercer kernels that are widely used in machine learning and statistics. The variance of a Gaussian kernel reflects the specific structure of the reproducing kernel Hilbert space (RKHS) induced by the Gaussian, as well as other important features of learning problems such as the frequency of function components. As the variance of the Gaussian decreases, the learning performance and approximation ability improve. This paper introduces an unregularized online algorithm with decreasing Gaussians, in which no regularization term is imposed and the samples are presented in sequence. With appropriate step sizes, concrete learning rates are derived under smoothness assumptions on the target function, which are used to bound the approximation error. Additionally, a new type of geometric noise condition is proposed to estimate the approximation error in place of any smoothness assumption. It is more general than the condition in Steinwart et al. (Ann Stat 35(2):575-607, 2007), which is suitable only for the hinge loss. An essential estimate is to bound the difference between the approximation functions generated by varying Gaussian RKHS. The Fourier transform plays a crucial role in our analysis.
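To make the setup concrete, the following is a minimal sketch of an unregularized online kernel algorithm with a shrinking Gaussian bandwidth, for the least-squares loss. The polynomial schedules for the step size and bandwidth (`eta0`, `theta`, `sigma0`, `beta`) are illustrative placeholders, not the specific choices analyzed in the paper.

```python
import numpy as np

def gaussian_kernel(x, z, sigma):
    # Gaussian (RBF) kernel with bandwidth parameter sigma
    return np.exp(-np.sum((x - z) ** 2) / (2 * sigma ** 2))

def unregularized_online_gaussian(samples, eta0=0.5, theta=0.25,
                                  sigma0=1.0, beta=0.1):
    """Online least-squares learning with a decreasing Gaussian bandwidth.

    Uses the illustrative schedules eta_t = eta0 * t^(-theta) (step size)
    and sigma_t = sigma0 * t^(-beta) (bandwidth).  No regularization term
    is imposed: each update is a pure stochastic-gradient step on the
    squared loss, appending one kernel-expansion coefficient per sample.
    """
    xs, alphas = [], []
    for t, (x, y) in enumerate(samples, start=1):
        sigma_t = sigma0 * t ** (-beta)
        # Evaluate the current hypothesis in the sigma_t-induced RKHS
        f_x = sum(a * gaussian_kernel(xi, x, sigma_t)
                  for a, xi in zip(alphas, xs))
        eta_t = eta0 * t ** (-theta)
        # Unregularized SGD update on the squared loss
        xs.append(x)
        alphas.append(-eta_t * (f_x - y))
    return xs, alphas, sigma_t

def predict(xs, alphas, sigma, x):
    # Evaluate the learned kernel expansion at a new point x
    return sum(a * gaussian_kernel(xi, x, sigma)
               for a, xi in zip(alphas, xs))
```

On a stream of repeated samples with a constant target, the iterates drive the prediction toward that target while the bandwidth shrinks from one round to the next.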
Pages: 403-440
Page count: 38
Related Articles
50 records in total
  • [1] Unregularized Online Algorithms with Varying Gaussians
    Baobin Wang
    Ting Hu
    Constructive Approximation, 2021, 53 : 403 - 440
  • [2] Convergence of Unregularized Online Learning Algorithms
    Lei, Yunwen
    Shi, Lei
    Guo, Zheng-Chu
    JOURNAL OF MACHINE LEARNING RESEARCH, 2018, 18
  • [3] Online Classification with Varying Gaussians
    Hu, Ting
    Zhou, Ding-Xuan
    STUDIES IN APPLIED MATHEMATICS, 2010, 124 (01) : 65 - 83
  • [4] Unregularized online learning algorithms with general loss functions
    Ying, Yiming
    Zhou, Ding-Xuan
    APPLIED AND COMPUTATIONAL HARMONIC ANALYSIS, 2017, 42 (02) : 224 - 244
  • [5] ONLINE REGRESSION WITH VARYING GAUSSIANS AND NON-IDENTICAL DISTRIBUTIONS
    Hu, Ting
    ANALYSIS AND APPLICATIONS, 2011, 9 (04) : 395 - 408
  • [6] Conditional quantiles with varying Gaussians
    Xiang, Dao-Hong
    ADVANCES IN COMPUTATIONAL MATHEMATICS, 2013, 38 (04) : 723 - 735
  • [8] Logistic classification with varying Gaussians
    Xiang, Dao-Hong
    COMPUTERS & MATHEMATICS WITH APPLICATIONS, 2011, 61 (02) : 397 - 407
  • [9] Improved Algorithms for Properly Learning Mixture of Gaussians
    Wu, Xuan
    Xie, Changzhi
    THEORETICAL COMPUTER SCIENCE (NCTCS 2018), 2018, 882 : 8 - 26
  • [10] Differentially Private Algorithms for Learning Mixtures of Separated Gaussians
    Kamath, Gautam
    Sheffet, Or
    Singhal, Vikrant
    Ullman, Jonathan
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32