Unregularized Online Algorithms with Varying Gaussians

Cited by: 2
Authors
Wang, Baobin [1 ]
Hu, Ting [2 ]
Affiliations
[1] South Cent Univ Nationalities, Sch Math & Stat, Wuhan 430074, Peoples R China
[2] Wuhan Univ, Sch Math & Stat, Wuhan 430072, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Online learning; Varying Gaussian kernels; Reproducing kernel Hilbert spaces; Geometric noise condition; Learning rate;
DOI
10.1007/s00365-021-09536-3
Chinese Library Classification (CLC)
O1 [Mathematics];
Discipline Classification Code
0701; 070101;
Abstract
Gaussian kernels are a family of Mercer kernels widely used in machine learning and statistics. The variance of a Gaussian kernel reflects the specific structure of the reproducing kernel Hilbert space (RKHS) it induces, as well as other important features of the learning problem, such as the frequency of the function components. As the variance of the Gaussian decreases, the approximation ability, and hence the learning performance, improves. This paper introduces an unregularized online algorithm with decreasing Gaussians, in which no regularization term is imposed and the samples are presented in sequence. With appropriate step sizes, concrete learning rates are derived under smoothness assumptions on the target function, which are used to bound the approximation error. Additionally, a new type of geometric noise condition is proposed to estimate the approximation error in place of any smoothness assumption; it is more general than that of Steinwart et al. (Ann Stat 35(2):575-607, 2007), which applies only to the hinge loss. An essential estimate is to bound the difference between the approximation functions generated by Gaussian RKHSs with varying variances. The Fourier transform plays a crucial role in our analysis.
Pages: 403-440
Number of pages: 38
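The abstract describes unregularized online learning in a Gaussian RKHS whose kernel width shrinks as samples arrive. The following is a minimal sketch, assuming a squared-loss stochastic gradient update of the form f_{t+1} = f_t - eta_t (f_t(x_t) - y_t) K_{sigma_t}(x_t, .), with polynomially decaying step sizes eta_t and widths sigma_t; the paper's exact loss, update rule, and schedules are not given in this record, so the function names, the decay exponents, and the toy data below are illustrative assumptions only.

```python
import numpy as np

# Sketch of an unregularized online algorithm with varying Gaussian kernels.
# The hypothesis is stored as an expansion f_t = sum_i a_i K_{sigma_i}(x_i, .),
# where each term keeps the Gaussian width sigma_i used at the step it was added.

def gaussian_kernel(x, centers, sigma):
    """K_sigma(x, c) = exp(-||x - c||^2 / (2 sigma^2)) for each center c."""
    diffs = np.asarray(centers, dtype=float) - x
    return np.exp(-np.sum(diffs ** 2, axis=1) / (2.0 * sigma ** 2))

def online_unregularized(samples, step_sizes, sigmas):
    """samples: stream of (x_t, y_t); step_sizes: eta_t; sigmas: decreasing widths."""
    centers, widths, coeffs = [], [], []
    for (x, y), eta, sigma in zip(samples, step_sizes, sigmas):
        x = np.atleast_1d(x).astype(float)
        # Evaluate the current hypothesis f_t(x_t).
        if centers:
            k_vals = np.array([gaussian_kernel(x, [c], s)[0]
                               for c, s in zip(centers, widths)])
            f_x = float(np.dot(coeffs, k_vals))
        else:
            f_x = 0.0
        # Unregularized SGD step on the squared loss: append a new kernel term
        # centered at x_t with the current (smaller) width sigma_t.
        centers.append(x)
        widths.append(sigma)
        coeffs.append(-eta * (f_x - y))

    def predict(x_new):
        x_new = np.atleast_1d(x_new).astype(float)
        return sum(a * gaussian_kernel(x_new, [c], s)[0]
                   for a, c, s in zip(coeffs, centers, widths))

    return predict

# Toy usage: learn y = sin(2*pi*x) from a noisy stream of 500 samples.
rng = np.random.default_rng(0)
T = 500
xs = rng.uniform(0.0, 1.0, T)
ys = np.sin(2.0 * np.pi * xs) + 0.1 * rng.standard_normal(T)
etas = [0.5 * (t + 1) ** (-0.5) for t in range(T)]      # eta_t ~ t^(-theta), assumed
sigmas = [0.5 * (t + 1) ** (-0.25) for t in range(T)]   # sigma_t ~ t^(-beta), assumed
f_hat = online_unregularized(zip(xs, ys), etas, sigmas)
print(f"f_hat(0.25) = {f_hat(0.25):.3f}")  # should roughly track sin(pi/2) = 1
```

Because the width changes at every step, each stored term carries its own sigma_i, so consecutive iterates live in different Gaussian RKHSs; this is the situation behind the key estimate mentioned in the abstract, namely bounding the difference between approximation functions generated by RKHSs of Gaussians with varying variances.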