Nonparametric Budgeted Stochastic Gradient Descent

Cited by: 0
Authors
Trung Le [1 ]
Vu Nguyen [1 ]
Tu Dinh Nguyen [1 ]
Dinh Phung [1 ]
Affiliations
[1] Deakin Univ, Pattern Recognit & Data Analyt, Geelong, Vic, Australia
Keywords
ONLINE; ALGORITHMS; PERCEPTRON
DOI
Not available
CLC Number (Chinese Library Classification)
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
One of the most challenging problems in kernel online learning is to bound the model size. Budgeted kernel online learning addresses this issue by capping the model size at a predefined budget. However, determining an appropriate value for such a budget is arduous. In this paper, we propose Nonparametric Budgeted Stochastic Gradient Descent, which allows the model size to grow with the data automatically and in a principled way. We provide theoretical analysis showing that our framework is guaranteed to converge for a large collection of loss functions (e.g., Hinge, Logistic, L2, L1, and epsilon-insensitive), which enables the proposed algorithm to perform both classification and regression tasks without hurting the ideal O(1/T) convergence rate of standard Stochastic Gradient Descent. We validate our algorithm on real-world datasets to consolidate the theoretical claims.
Pages: 564-572
Number of pages: 9
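
The record above concerns budgeted kernel online learning, where kernel SGD must keep its set of support vectors from growing linearly with the data stream. The sketch below is a minimal, hypothetical Python illustration of that idea, not the authors' exact construction: kernel SGD with the hinge loss and the standard 1/(lambda*t) step size, where a new support vector is admitted only if it lies farther than a threshold tau (in feature-space distance) from every existing one; otherwise the update is merged into the nearest support vector. The class name BudgetedKernelSGD, the admission rule, and all parameter values are assumptions made for illustration.

import numpy as np

def rbf(x, z, gamma=1.0):
    # Gaussian RBF kernel: k(x, z) = exp(-gamma * ||x - z||^2).
    return float(np.exp(-gamma * np.sum((x - z) ** 2)))

class BudgetedKernelSGD:
    # Hinge-loss kernel SGD whose support set grows with the data.
    # Hypothetical sketch: a point becomes a new support vector only
    # if it is far (feature-space distance > tau) from all current
    # support vectors; otherwise its update is absorbed by the
    # nearest one, so the model size tracks data coverage, not T.

    def __init__(self, gamma=1.0, lam=0.01, tau=0.5):
        self.gamma, self.lam, self.tau = gamma, lam, tau
        self.sv = []      # support vectors
        self.alpha = []   # their coefficients
        self.t = 0        # iteration counter

    def decision(self, x):
        # f(x) = sum_i alpha_i * k(sv_i, x)
        return sum(a * rbf(s, x, self.gamma) for s, a in zip(self.sv, self.alpha))

    def partial_fit(self, x, y):
        # One SGD step on lam/2 * ||f||^2 + hinge loss at (x, y).
        x = np.asarray(x, dtype=float)
        self.t += 1
        eta = 1.0 / (self.lam * self.t)          # classic 1/(lam*t) step size
        self.alpha = [(1.0 - eta * self.lam) * a for a in self.alpha]  # shrink
        if y * self.decision(x) < 1.0:           # hinge loss is active
            if self.sv:
                ks = np.array([rbf(s, x, self.gamma) for s in self.sv])
                j = int(np.argmax(ks))
                # Feature-space distance: ||phi(x) - phi(s)||^2 = 2 - 2 k(x, s).
                if 2.0 - 2.0 * ks[j] <= self.tau ** 2:
                    self.alpha[j] += eta * y     # merge; do not grow the model
                    return
            self.sv.append(x)                    # admit a new support vector
            self.alpha.append(eta * y)

A quick check on an XOR-like stream shows the intended behavior: the support set stops growing once the input region is covered, while points from unseen regions still enlarge it.

rng = np.random.default_rng(0)
model = BudgetedKernelSGD(gamma=2.0, lam=0.1, tau=0.7)
for _ in range(500):
    x = rng.uniform(-1.0, 1.0, size=2)
    model.partial_fit(x, 1.0 if x[0] * x[1] > 0 else -1.0)
print("support vectors:", len(model.sv))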