Least Squares Model Averaging for Distributed Data

Cited by: 0
Authors
Zhang, Haili [1 ]
Liu, Zhaobo [2 ]
Zou, Guohua [3 ]
Affiliations
[1] Shenzhen Polytech Univ, Inst Appl Math, Shenzhen 518055, Peoples R China
[2] Shenzhen Univ, Inst Adv Study, Shenzhen 518060, Peoples R China
[3] Capital Normal Univ, Sch Math Sci, Beijing 100048, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
consistency; distributed data; divide and conquer algorithm; Mallows' criterion; model averaging; optimality; FOCUSED INFORMATION CRITERION; BIG DATA; REGRESSION; SELECTION; INFERENCE;
DOI
Not available
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
The divide-and-conquer algorithm is a common strategy for handling big data. Model averaging has a naturally divide-and-conquer structure, but its theory has not been developed for big data settings, and the goal of this paper is to fill that gap. We propose two divide-and-conquer-type model averaging estimators for linear models with distributed data. Under regularity conditions, we show that the weights obtained from the Mallows model averaging criterion converge in L2 to the theoretically optimal weights that minimize the risk of the model averaging estimator. We also derive bounds on the in-sample and out-of-sample mean squared errors and prove the asymptotic optimality of the proposed estimators. These conclusions hold even when the dimension and the number of candidate models diverge. Simulation results and an analysis of real airline data show that the proposed methods outperform commonly used model selection and model averaging methods in distributed data settings. Our approaches contribute to the theory of model averaging for distributed data and parallel computation, and can be applied in big data analysis to save time and reduce the computational burden.
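As an informal illustration of the divide-and-conquer idea summarized above, the Python sketch below has each block of data report only low-dimensional sufficient statistics, pools them on a central node, and then minimizes Mallows' criterion over the weight simplex for a set of nested candidate linear models. The function names, the block split, the nested candidate set, and the use of the largest model to estimate the error variance are hypothetical choices for this example, not the paper's exact estimators.

```python
# Illustrative sketch only: divide-and-conquer Mallows model averaging
# for a linear regression with nested candidate models.
import numpy as np
from scipy.optimize import minimize

def local_stats(Xk, yk):
    # Each machine returns only X'X, X'y, y'y, and its sample size.
    return Xk.T @ Xk, Xk.T @ yk, yk @ yk, len(yk)

def mallows_weights(xtx, xty, yty, n, subsets):
    """Minimize Mallows' criterion over the simplex using pooled statistics."""
    M = len(subsets)
    betas = [np.linalg.solve(xtx[np.ix_(s, s)], xty[s]) for s in subsets]
    k = np.array([len(s) for s in subsets], dtype=float)
    # Error variance estimated from the largest candidate model (an assumption here).
    big = max(range(M), key=lambda m: len(subsets[m]))
    sigma2 = (yty - betas[big] @ xty[subsets[big]]) / (n - k[big])
    # Quadratic form giving the squared residual of the weighted fit.
    A = np.array([[betas[m] @ xtx[np.ix_(subsets[m], subsets[l])] @ betas[l]
                   for l in range(M)] for m in range(M)])
    b = np.array([betas[m] @ xty[subsets[m]] for m in range(M)])
    crit = lambda w: yty - 2 * b @ w + w @ A @ w + 2 * sigma2 * k @ w
    res = minimize(crit, np.full(M, 1.0 / M), method="SLSQP",
                   bounds=[(0, 1)] * M,
                   constraints={"type": "eq", "fun": lambda w: w.sum() - 1.0})
    return res.x, betas

# Toy usage: 4 machines, 6 regressors, 5 nested candidate models.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 6))
y = X[:, :3] @ np.array([1.0, 0.5, 0.25]) + rng.normal(size=2000)
blocks = [local_stats(Xk, yk)
          for Xk, yk in zip(np.array_split(X, 4), np.array_split(y, 4))]
xtx = sum(b[0] for b in blocks); xty = sum(b[1] for b in blocks)
yty = sum(b[2] for b in blocks); n = sum(b[3] for b in blocks)
subsets = [list(range(j)) for j in range(2, 7)]  # nested candidate models
w, _ = mallows_weights(xtx, xty, yty, n, subsets)
print(np.round(w, 3))
```

Because only the pooled X'X, X'y, and y'y enter the criterion, each block can be processed in parallel and only small matrices need to be communicated, which is the computational saving the abstract refers to.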
Pages: 59