Distributed Learning with Regularized Least Squares

Cited by: 1
Authors
Lin, Shao-Bo [1 ]
Guo, Xin [2 ]
Zhou, Ding-Xuan [1 ]
Affiliations
[1] City Univ Hong Kong, Dept Math, Tat Chee Ave, Kowloon, Hong Kong, Peoples R China
[2] Hong Kong Polytech Univ, Dept Appl Math, Kowloon, Hong Kong, Peoples R China
Keywords
Distributed learning; divide-and-conquer; error analysis; integral operator; second order decomposition; kernel; algorithms; regression; rates; operators; networks; gradient; theorem
DOI
Not available
Chinese Library Classification
TP [Automation technology; computer technology]
Discipline Classification Code
0812
Abstract
We study distributed learning with the least squares regularization scheme in a reproducing kernel Hilbert space (RKHS). Following a divide-and-conquer approach, the algorithm partitions a data set into disjoint subsets, applies the least squares regularization scheme to each subset to produce a local output function, and then averages the local output functions to form a final global estimator or predictor. We show, via error bounds and learning rates in expectation in both the L²-metric and the RKHS-metric, that the global output function of this distributed learning scheme is a good approximation to the output of the algorithm processing the whole data set on a single machine. The derived learning rates in expectation are optimal and hold in a general setting without any eigenfunction assumption. The analysis relies on a novel second order decomposition of operator differences in our integral operator approach. Even for the classical least squares regularization scheme in the RKHS associated with a general kernel, we obtain the best learning rate in expectation in the literature.
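The divide-and-conquer scheme described in the abstract can be illustrated with a short sketch: each machine solves a kernel ridge regression problem on its own subset, and the local predictors are averaged. The Gaussian kernel, the regularization level lam, the number of machines m, and the toy sine-regression data below are illustrative assumptions for the sketch, not the paper's setting or experiments.

```python
import numpy as np

def gaussian_kernel(X1, X2, sigma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix between two sample sets.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def krr_fit(X, y, lam, sigma=1.0):
    # Least squares regularization on one subset:
    # solve (K + lam * n * I) alpha = y for the local estimator.
    n = len(X)
    K = gaussian_kernel(X, X, sigma)
    alpha = np.linalg.solve(K + lam * n * np.eye(n), y)
    return X, alpha

def krr_predict(model, X_new, sigma=1.0):
    # Evaluate a local estimator at new points.
    X_train, alpha = model
    return gaussian_kernel(X_new, X_train, sigma) @ alpha

def distributed_krr(X, y, m, lam, sigma=1.0):
    # Divide-and-conquer: split the sample into m disjoint subsets,
    # fit a local estimator on each, and average the local predictors.
    idx = np.array_split(np.random.permutation(len(X)), m)
    models = [krr_fit(X[i], y[i], lam, sigma) for i in idx]
    return lambda X_new: np.mean(
        [krr_predict(mod, X_new, sigma) for mod in models], axis=0)

# Toy usage: regression of a noisy sine function on m = 4 machines.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(600, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(600)
f_bar = distributed_krr(X, y, m=4, lam=1e-3)
X_test = np.linspace(-3, 3, 5).reshape(-1, 1)
print(f_bar(X_test))
```

The averaged predictor plays the role of the global output function analyzed in the paper; the point of the error analysis is that, for suitable regularization and number of subsets, it performs comparably to solving the regularized least squares problem on the full data set at once.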
Pages: 31