Distributed Learning with Regularized Least Squares

Cited by: 1
Authors
Lin, Shao-Bo [1 ]
Guo, Xin [2 ]
Zhou, Ding-Xuan [1 ]
Affiliations
[1] City Univ Hong Kong, Dept Math, Tat Chee Ave, Kowloon, Hong Kong, Peoples R China
[2] Hong Kong Polytech Univ, Dept Appl Math, Kowloon, Hong Kong, Peoples R China
Keywords
Distributed learning; divide-and-conquer; error analysis; integral operator; second order decomposition; KERNEL; ALGORITHMS; REGRESSION; RATES; OPERATORS; NETWORKS; GRADIENT; THEOREM
DOI
Not available
Chinese Library Classification
TP [Automation technology; computer technology]
Discipline Classification Code
0812
Abstract
We study distributed learning with the least squares regularization scheme in a reproducing kernel Hilbert space (RKHS). Following a divide-and-conquer approach, the algorithm partitions a data set into disjoint subsets, applies the least squares regularization scheme to each subset to produce a local output function, and then takes the average of these local output functions as the final global estimator or predictor. We show, through error bounds and learning rates in expectation in both the L^2-metric and the RKHS-metric, that this global output function is a good approximation to the output of the algorithm processing the whole data set on a single machine. The derived learning rates in expectation are optimal and are stated in a general setting without any eigenfunction assumption. The analysis rests on a novel second order decomposition of operator differences in our integral operator approach. Even for the classical least squares regularization scheme in the RKHS associated with a general kernel, we obtain the best learning rate in expectation known in the literature.
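To make the abstract's analytic tool concrete: the "second order decomposition of operator differences" is, in spirit, a resolvent-type identity for invertible positive operators A and B of the following form (this explicit form is an illustrative reconstruction, not quoted from the paper):

    A^{-1} - B^{-1} = B^{-1}(B - A)B^{-1} + B^{-1}(B - A)B^{-1}(B - A)A^{-1}.

It follows by substituting A^{-1} = B^{-1} + B^{-1}(B - A)A^{-1} into the first order identity A^{-1} - B^{-1} = B^{-1}(B - A)A^{-1}. The first term on the right is first order in the perturbation B - A while the second is second order, which is what allows sharper error bounds than a first order analysis.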
Pages: 31
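For readers who want to experiment, here is a minimal, self-contained Python sketch of the divide-and-conquer scheme described in the abstract: kernel ridge regression (the least squares regularization scheme in an RKHS) is fitted on each disjoint data subset and the local estimators are averaged. The Gaussian kernel, the helper names gaussian_kernel, krr_fit and distributed_krr, and all parameter values are illustrative assumptions, not taken from the paper.

import numpy as np

def gaussian_kernel(X, Z, sigma=1.0):
    # Gram matrix K[i, j] = exp(-||X[i] - Z[j]||^2 / (2 * sigma^2)).
    # The Gaussian kernel is an assumption; the paper allows a general Mercer kernel.
    sq_dists = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dists / (2.0 * sigma ** 2))

def krr_fit(X, y, lam, sigma=1.0):
    # Least squares regularization scheme on one data subset:
    #   minimize (1/m) * sum_i (f(x_i) - y_i)^2 + lam * ||f||_K^2,
    # whose minimizer is f(x) = sum_i alpha_i K(x, x_i) with
    #   (K + lam * m * I) alpha = y.
    m = X.shape[0]
    K = gaussian_kernel(X, X, sigma)
    alpha = np.linalg.solve(K + lam * m * np.eye(m), y)
    return lambda X_new: gaussian_kernel(X_new, X, sigma) @ alpha

def distributed_krr(X, y, num_machines, lam, sigma=1.0):
    # Divide-and-conquer: split the data into disjoint subsets, run the
    # regularization scheme on each, and average the local output functions.
    index_blocks = np.array_split(np.arange(X.shape[0]), num_machines)
    local_fns = [krr_fit(X[idx], y[idx], lam, sigma) for idx in index_blocks]
    return lambda X_new: np.mean([f(X_new) for f in local_fns], axis=0)

# Toy usage: the averaged estimator tracks the single-machine one.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(200, 1))
y = np.sin(2.0 * np.pi * X[:, 0]) + 0.1 * rng.standard_normal(200)
f_global = krr_fit(X, y, lam=1e-3, sigma=0.2)                  # whole data set
f_dist = distributed_krr(X, y, num_machines=4, lam=1e-3, sigma=0.2)
X_test = np.linspace(0.0, 1.0, 5).reshape(-1, 1)
print(f_global(X_test))
print(f_dist(X_test))

On this toy problem the two printed prediction vectors are close, matching the abstract's claim that the averaged global estimator approximates the single-machine estimator.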
Related Papers
50 items in total (items [31]–[40] shown)
  • [31] Least Squares Model Averaging for Distributed Data
    Zhang, Haili
    Liu, Zhaobo
    Zou, Guohua
    JOURNAL OF MACHINE LEARNING RESEARCH, 2023, 24
  • [32] Budget Online Learning Algorithm for Least Squares SVM
    Jian, Ling
    Shen, Shuqian
    Li, Jundong
    Liang, Xijun
    Li, Lei
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2017, 28 (09) : 2076 - 2087
  • [33] Least Squares Approximations in Linear Statistical Inverse Learning Problems
    Helin, Tapio
    SIAM JOURNAL ON NUMERICAL ANALYSIS, 2024, 62 (04) : 2025 - 2047
  • [34] Monitoring Least Squares Models of Distributed Streams
    Gabel, Moshe
    Keren, Daniel
    Schuster, Assaf
    KDD'15: PROCEEDINGS OF THE 21ST ACM SIGKDD INTERNATIONAL CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, 2015, : 319 - 328
  • [35] Mitigating Quantization Effects on Distributed Sensor Fusion: A Least Squares Approach
    Zhu, Shanying
    Chen, Cailian
    Xu, Jinming
    Guan, Xinping
    Xie, Lihua
    Johansson, Karl Henrik
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2018, 66 (13) : 3459 - 3474
  • [36] A regularized interior-point method for constrained linear least squares
    Dehghani, Mohsen
    Lambe, Andrew
    Orban, Dominique
    INFOR, 2020, 58 (02) : 202 - 224
  • [37] On a consistent procedure for distributed recursive nonlinear least-squares estimation
    Kar, Soummya
    Moura, Jose M. F.
    Poor, H. Vincent
    2013 IEEE GLOBAL CONFERENCE ON SIGNAL AND INFORMATION PROCESSING (GLOBALSIP), 2013, : 891 - 894
  • [38] REGULARIZED SEMI-SUPERVISED LEAST SQUARES REGRESSION WITH DEPENDENT SAMPLES
    Tong, Hongzhi
    Ng, Michael
    COMMUNICATIONS IN MATHEMATICAL SCIENCES, 2018, 16 (05) : 1347 - 1360
  • [39] Analysis of regularized least-squares in reproducing kernel Krein spaces
    Liu, Fanghui
    Shi, Lei
    Huang, Xiaolin
    Yang, Jie
    Suykens, Johan A. K.
    MACHINE LEARNING, 2021, 110 (06) : 1145 - 1173
  • [40] Distributed Learning With Dependent Samples
    Sun, Zirui
    Lin, Shao-Bo
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2022, 68 (09) : 6003 - 6020