Distributed Learning with Regularized Least Squares

Cited by: 1
Authors
Lin, Shao-Bo [1 ]
Guo, Xin [2 ]
Zhou, Ding-Xuan [1 ]
Affiliations
[1] City Univ Hong Kong, Dept Math, Tat Chee Ave, Kowloon, Hong Kong, Peoples R China
[2] Hong Kong Polytech Univ, Dept Appl Math, Kowloon, Hong Kong, Peoples R China
Keywords
Distributed learning; divide-and-conquer; error analysis; integral operator; second order decomposition; KERNEL; ALGORITHMS; REGRESSION; RATES; OPERATORS; NETWORKS; GRADIENT; THEOREM;
DOI
Not available
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Discipline Code
0812
Abstract
We study distributed learning with the least squares regularization scheme in a reproducing kernel Hilbert space (RKHS). By a divide-and-conquer approach, the algorithm partitions a data set into disjoint subsets, applies the least squares regularization scheme to each subset to produce an output function, and then takes an average of the individual output functions as the final global estimator or predictor. We show, with error bounds and learning rates in expectation in both the L2-metric and the RKHS-metric, that the global output function of this distributed learning is a good approximation of the output function obtained by processing the whole data set on a single machine. Our derived learning rates in expectation are optimal and stated in a general setting without any eigenfunction assumption. The analysis is achieved by a novel second order decomposition of operator differences in our integral operator approach. Even for the classical least squares regularization scheme in the RKHS associated with a general kernel, we give the best learning rate in expectation in the literature.
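The divide-and-conquer scheme described in the abstract can be sketched in a few lines: split the data into disjoint subsets, solve the regularized least squares problem (kernel ridge regression) on each subset, and average the resulting local predictors. The sketch below is a minimal illustration, not the paper's implementation; the Gaussian kernel, the bandwidth `sigma`, and all function names are assumptions chosen for the example.

```python
import numpy as np

def gaussian_kernel(X, Z, sigma=0.5):
    # Pairwise Gaussian kernel matrix: K[i, j] = exp(-||x_i - z_j||^2 / (2 sigma^2))
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def krr_fit(X, y, lam, sigma=0.5):
    # Least squares regularization in the RKHS on one subset:
    # alpha = (K + lam * n * I)^{-1} y
    n = X.shape[0]
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + lam * n * np.eye(n), y)

def distributed_krr(X, y, m, lam, sigma=0.5):
    # Divide-and-conquer: partition the data into m disjoint subsets,
    # fit a local regularized least squares estimator on each, and return
    # a predictor that averages the m local output functions.
    subsets = np.array_split(np.arange(len(X)), m)
    models = [(X[i], krr_fit(X[i], y[i], lam, sigma)) for i in subsets]
    def predict(Xt):
        preds = [gaussian_kernel(Xt, Xi, sigma) @ alpha for Xi, alpha in models]
        return np.mean(preds, axis=0)
    return predict

# Toy regression problem: y = sin(x) + noise, 4 local machines.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(400, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(400)
f = distributed_krr(X, y, m=4, lam=1e-3)
Xt = np.linspace(-3, 3, 50)[:, None]
err = np.mean((f(Xt) - np.sin(Xt[:, 0])) ** 2)
```

The averaged predictor approximates the single-machine estimator while each machine only inverts a 100x100 kernel matrix instead of a 400x400 one, which is the computational point of the divide-and-conquer approach.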
Pages: 31
Related Papers
50 records in total
  • [1] Distributed regularized least squares with flexible Gaussian kernels
    Hu, Ting
    Zhou, Ding-Xuan
    APPLIED AND COMPUTATIONAL HARMONIC ANALYSIS, 2021, 53 : 349 - 377
  • [2] Online regularized pairwise learning with least squares loss
    Wang, Cheng
    Hu, Ting
    ANALYSIS AND APPLICATIONS, 2020, 18 (01) : 49 - 78
  • [3] The regularized least squares algorithm and the problem of learning halfspaces
    Minh, Ha Quang
    INFORMATION PROCESSING LETTERS, 2011, 111 (08) : 395 - 401
  • [4] Kernel-Based Regularized Learning with Random Projections: Beyond Least Squares
    Liu, Jiamin
    Gao, Junzhuo
    Lian, Heng
    SIAM JOURNAL ON MATHEMATICS OF DATA SCIENCE, 2025, 7 (01): : 253 - 273
  • [5] Discriminatively regularized least-squares classification
    Xue, Hui
    Chen, Songcan
    Yang, Qiang
    PATTERN RECOGNITION, 2009, 42 (01) : 93 - 104
  • [6] Error analysis of distributed least squares ranking
    Chen, Hong
    Li, Han
    Pan, Zhibin
    NEUROCOMPUTING, 2019, 361 : 222 - 228
  • [7] Optimality of regularized least squares ranking with imperfect kernels
    He, Fangchao
    Zeng, Yu
    Zheng, Lie
    Wu, Qiang
    INFORMATION SCIENCES, 2022, 589 : 564 - 579
  • [8] Regularized least squares potential SVRs
    Jayadeva
    Deb, Alok Kanti
Khemchandani, Reshma
    Chandra, Suresh
    2006 ANNUAL IEEE INDIA CONFERENCE, 2006, : 565 - +
  • [9] Regularized least-squares regression: Learning from a β-mixing sequence
    Farahmand, Amir-Massoud
    Szepesvari, Csaba
    JOURNAL OF STATISTICAL PLANNING AND INFERENCE, 2012, 142 (02) : 493 - 505
  • [10] Partial least squares with a regularized weight
    Fu, Yingxiong
    Peng, Jiangtao
    Dong, Xuemei
    JOURNAL OF MATHEMATICAL CHEMISTRY, 2016, 54 (02) : 403 - 415