The Evolution of Boosting Algorithms: From Machine Learning to Statistical Modelling

Cited by: 220
Authors
Mayr, A. [1]
Binder, H. [2]
Gefeller, O. [1]
Schmid, M. [1,3]
Affiliations
[1] Friedrich Alexander Univ Erlangen Nurnberg FAU, Inst Medizininformat Biomet & Epidemiol, D-91054 Erlangen, Germany
[2] Johannes Gutenberg Univ Mainz, Inst Med Biomet Epidemiol & Informat, Mainz, Germany
[3] Univ Bonn, Inst Med Biomet Informat & Epidemiol, Bonn, Germany
Keywords
Statistical computing; statistical models; algorithms; classification; machine learning; evidence contrary; regression; selection; view; regularization; prediction
DOI
10.3414/ME13-01-0122
Chinese Library Classification (CLC)
TP [Automation and Computer Technology]
Discipline classification code
0812
Abstract
Background: The concept of boosting emerged from the field of machine learning. The basic idea is to boost the accuracy of a weak classifier by combining multiple instances of it into a more accurate prediction. This general concept was later adapted to the field of statistical modelling. Nowadays, boosting algorithms are often applied to estimate and select predictor effects in statistical regression models.
Objectives: This review article highlights the evolution of boosting algorithms from machine learning to statistical modelling.
Methods: We describe the AdaBoost algorithm for classification as well as the two most prominent statistical boosting approaches, gradient boosting and likelihood-based boosting. We highlight the methodological background and present the most common software implementations.
Results: Although gradient boosting and likelihood-based boosting are typically treated separately in the literature, they share the same methodological roots and follow the same fundamental concepts. Compared to the initial machine learning algorithms, which must be seen as black-box prediction schemes, they result in statistical models with a straightforward interpretation.
Conclusions: Statistical boosting algorithms have gained substantial interest during the last decade and offer a variety of options to address important research questions in modern biomedicine.
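To make the shared fitting scheme behind statistical boosting concrete, the following is a minimal Python sketch of component-wise L2 (gradient) boosting for a linear regression model. It is not taken from the article; the simulated data and the fixed values mstop = 100 and nu = 0.1 are illustrative assumptions. In each iteration, every candidate predictor is fitted separately to the current residuals (the negative gradient of the squared-error loss), only the best-fitting predictor is kept, and a small fraction of its coefficient is added to the model.

    import numpy as np

    # Component-wise L2 boosting sketch: weak least-squares base-learners,
    # one per predictor, combined over mstop iterations with step length nu.
    rng = np.random.default_rng(0)
    n, p = 200, 10
    X = rng.standard_normal((n, p))                              # candidate predictors
    y = 2.0 * X[:, 0] - 1.0 * X[:, 3] + rng.standard_normal(n)   # toy outcome

    mstop, nu = 100, 0.1       # illustrative stopping iteration and step length
    beta = np.zeros(p)         # boosted coefficient estimates
    offset = y.mean()          # start from the empirical mean

    for m in range(mstop):
        residuals = y - offset - X @ beta                # negative gradient of the L2 loss
        slopes = X.T @ residuals / np.sum(X**2, axis=0)  # simple least-squares fit per predictor
        rss = np.sum((residuals[:, None] - X * slopes)**2, axis=0)
        best = int(np.argmin(rss))                       # best-fitting base-learner
        beta[best] += nu * slopes[best]                  # weak update of one coefficient only

    print("selected predictors:", np.nonzero(beta)[0])
    print("coefficient estimates:", np.round(beta, 2))

Because only one coefficient is updated per iteration and the procedure is stopped early, many coefficients remain exactly zero; this is how statistical boosting combines shrinkage with implicit variable selection and yields an interpretable regression model rather than a black-box predictor.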
Pages: 419-427
Number of pages: 9