Information-Based Optimal Subdata Selection for Big Data Linear Regression

Cited by: 160
Authors
Wang, HaiYing [1 ]
Yang, Min [2 ]
Stufken, John [3 ]
Affiliations
[1] Univ Connecticut, Dept Stat, Mansfield, CT USA
[2] Univ Illinois, Dept Math Stat & Comp Sci, Chicago, IL USA
[3] Arizona State Univ, Sch Math & Stat Sci, Tempe, AZ 85287 USA
Funding
National Science Foundation (USA)
Keywords
D-optimality; Information matrix; Linear regression; Massive data; Subdata;
DOI
10.1080/01621459.2017.1408468
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline Codes
020208; 070103; 0714
Abstract
Extraordinary amounts of data are being produced in many branches of science. Proven statistical methods are no longer applicable with extraordinarily large datasets due to computational limitations. A critical step in big data analysis is data reduction. Existing investigations in the context of linear regression focus on subsampling-based methods. However, not only is this approach prone to sampling errors, it also leads to a covariance matrix of the estimators that is typically bounded from below by a term that is of the order of the inverse of the subdata size. We propose a novel approach, termed information-based optimal subdata selection (IBOSS). Compared to leading existing subdata methods, the IBOSS approach has the following advantages: (i) it is significantly faster; (ii) it is suitable for distributed parallel computing; (iii) the variances of the slope parameter estimators converge to 0 as the full data size increases even if the subdata size is fixed, that is, the convergence rate depends on the full data size; (iv) data analysis for IBOSS subdata is straightforward and the sampling distribution of an IBOSS estimator is easy to assess. Theoretical results and extensive simulations demonstrate that the IBOSS approach is superior to subsampling-based methods, sometimes by orders of magnitude. The advantages of the new approach are also illustrated through analysis of real data. Supplementary materials for this article are available online.
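The abstract describes selecting subdata to maximize the information matrix under a D-optimality criterion rather than sampling at random. A minimal sketch of that idea, assuming a common extreme-value heuristic (keep, for each covariate, the rows with the smallest and largest values); the function name and details are illustrative assumptions, not the authors' published implementation:

```python
import numpy as np


def iboss_subdata(X, k):
    """Illustrative D-optimality-motivated subdata selection (assumption, not
    the paper's verbatim algorithm): for each of the p covariates in turn,
    keep the r = k / (2p) still-available rows with the smallest values and
    the r rows with the largest values of that covariate, so extreme points
    dominate the subdata's information matrix."""
    n, p = X.shape
    r = k // (2 * p)
    if r == 0:
        raise ValueError("subdata size k must be at least 2 * p")
    available = np.ones(n, dtype=bool)   # rows not yet selected
    selected = []
    for j in range(p):
        idx = np.where(available)[0]             # indices still available
        order = np.argsort(X[idx, j])            # sort by covariate j
        chosen = np.concatenate([idx[order[:r]], idx[order[-r:]]])
        selected.extend(chosen.tolist())
        available[chosen] = False                # exclude from later passes
    return np.array(selected)
```

Because each pass only sorts one column of the remaining rows, the cost stays near-linear in the full data size n, which is consistent with the speed and parallelizability advantages the abstract claims for fixed subdata size k.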
Pages: 393-405 (13 pages)