Boosting regression estimators

Cited by: 73
Authors
Avnimelech, R [1 ]
Intrator, N [1 ]
Affiliation
[1] Tel Aviv Univ, Sackler Fac Exact Sci, Dept Comp Sci, IL-69978 Tel Aviv, Israel
DOI: 10.1162/089976699300016746
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
There is interest in extending the boosting algorithm (Schapire, 1990) to fit a wide range of regression problems. The threshold-based boosting algorithm for regression uses an analogy between classification errors and large errors in regression. We focus on the practical aspects of this algorithm and compare it with other attempts to extend boosting to regression. The practical capabilities of this model are demonstrated on the laser data from the Santa Fe time-series competition and on the Mackey-Glass time series, where the results surpass those of a standard ensemble average.
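The abstract's central idea — treating a regression prediction whose absolute error exceeds a threshold as a "mistake", so AdaBoost-style reweighting carries over — can be sketched as follows. This is an illustrative reconstruction, not the authors' exact algorithm: the regression-stump weak learner, the precise reweighting rule, and the weighted-median combination (in the style of Drucker, 1997) are all assumptions made for the sketch; the paper itself works with neural-network estimators.

```python
import numpy as np

def fit_stump(X, y, w):
    """Weighted 1-D regression stump: weighted mean on each side of a split.
    (Illustrative weak learner only; not the paper's base estimator.)"""
    best = None
    for s in np.unique(X)[:-1]:            # candidate split points
        left = X <= s
        ml = np.average(y[left], weights=w[left])
        mr = np.average(y[~left], weights=w[~left])
        loss = np.sum(w * (np.where(left, ml, mr) - y) ** 2)
        if best is None or loss < best[0]:
            best = (loss, s, ml, mr)
    if best is None:                        # all X identical: constant predictor
        m = np.average(y, weights=w)
        return (X[0], m, m)
    return best[1:]

def predict_stump(stump, X):
    s, ml, mr = stump
    return np.where(X <= s, ml, mr)

def boost_regression(X, y, threshold, n_rounds=10):
    """Threshold-based boosting: an absolute error above `threshold`
    counts as a 'mistake', mirroring a misclassification in AdaBoost."""
    n = len(y)
    w = np.full(n, 1.0 / n)                 # sample weights
    learners, alphas = [], []
    for _ in range(n_rounds):
        stump = fit_stump(X, y, w)
        miss = np.abs(predict_stump(stump, X) - y) > threshold
        eps = w[miss].sum()                 # weighted big-error rate
        learners.append(stump)
        if eps == 0 or eps >= 0.5:          # perfect fit or weak-learning failure
            alphas.append(1.0)
            break
        beta = eps / (1.0 - eps)
        alphas.append(np.log(1.0 / beta))
        w[~miss] *= beta                    # down-weight well-predicted points
        w /= w.sum()
    return learners, np.array(alphas)

def weighted_median(values, weights):
    idx = np.argsort(values)
    c = np.cumsum(weights[idx])
    return values[idx][np.searchsorted(c, 0.5 * c[-1])]

def predict_ensemble(learners, alphas, X):
    """Combine the weak predictions by their alpha-weighted median."""
    P = np.stack([predict_stump(l, X) for l in learners])  # rounds x samples
    return np.array([weighted_median(P[:, i], alphas)
                     for i in range(P.shape[1])])
```

The weak-learning condition mirrors the classification case: each round's weighted big-error rate must stay below 1/2 for the reweighting to make sense, which is why the loop stops when it is violated.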
Pages: 499-520
Page count: 22
References
22 entries in total
  • [1] [Anonymous], 1993, ADV NEURAL INF PROCE
  • [2] Bagging predictors
    Breiman, L
    [J]. MACHINE LEARNING, 1996, 24 (02) : 123 - 140
  • [3] BREIMAN L, 1996, TR460 U CAL DEP STAT
  • [4] BREIMAN L, 1997, TR486 U CAL DEP STAT
  • [5] CROWDER S, 1990, CONNECTIONIST MODELS
  • [6] DRUCKER H, 1997, 14 INT C MACH LEARN
  • [7] Efron B., 1994, INTRO BOOTSTRAP, V57, DOI 10.1201/9780429246593
  • [8] BOOSTING A WEAK LEARNING ALGORITHM BY MAJORITY
    FREUND, Y
    [J]. INFORMATION AND COMPUTATION, 1995, 121 (02) : 256 - 285
  • [9] FREUND Y, 1995, 2 EUR C COMP LEARN T
  • [10] NEURAL NETWORKS AND THE BIAS VARIANCE DILEMMA
    GEMAN, S
    BIENENSTOCK, E
    DOURSAT, R
    [J]. NEURAL COMPUTATION, 1992, 4 (01) : 1 - 58