Averaging regularized estimators

Cited by: 43
Authors
Taniguchi, M
Tresp, V
Affiliation
[1] Siemens AG, Central Research
Keywords
DOI
10.1162/neco.1997.9.5.1163
CLC number
TP18 [Artificial intelligence theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
We compare the performance of averaged regularized estimators. We show that the improvement in performance that can be achieved by averaging depends critically on the degree of regularization used in training the individual estimators. We compare four averaging approaches: simple averaging, bagging, variance-based weighting, and variance-based bagging. For all of the averaging methods, the greatest improvement over the individual estimators is achieved if no or only a small degree of regularization is used; here, variance-based weighting and variance-based bagging are superior to simple averaging or bagging. Our experiments indicate that better performance, both for the individual estimators and for averaging, is achieved in combination with regularization. With increasing degrees of regularization, the two bagging-based approaches (bagging and variance-based bagging) outperform the individual estimators, simple averaging, and variance-based weighting. Bagging and variance-based bagging appear to be the best overall combining methods across a wide range of degrees of regularization.
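The combining schemes named in the abstract can be illustrated concretely. The sketch below (Python with NumPy) is a minimal illustration under stated assumptions, not the authors' implementation: it uses ridge-regularized linear estimators on synthetic data, `alpha` stands in for the degree of regularization, bagging draws bootstrap resamples, and variance-based weighting is approximated by weighting each ensemble member inversely to its mean squared deviation from the ensemble mean.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression problem (illustrative assumption, not data from the paper).
n, d, m = 50, 5, 10                      # samples, features, ensemble size
w_true = rng.normal(size=d)
X = rng.normal(size=(n, d))
y = X @ w_true + 0.5 * rng.normal(size=n)

def ridge_fit(X, y, alpha):
    """Ridge-regularized least squares; alpha sets the degree of regularization."""
    return np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T @ y)

alpha = 0.1  # a small degree of regularization; the paper varies this

# Bagging: each ensemble member is trained on a bootstrap resample.
# (With a deterministic fit like ridge, resampling is what makes the members
# differ; simple averaging of identical members would change nothing.)
members = []
for _ in range(m):
    idx = rng.integers(0, n, size=n)     # bootstrap resample with replacement
    members.append(ridge_fit(X[idx], y[idx], alpha))

X_test = rng.normal(size=(20, d))
preds = np.stack([X_test @ w for w in members])   # shape (m, n_test)

# Bagged average: uniform weights over the resampled members.
pred_bagging = preds.mean(axis=0)

# Variance-based weighting (hedged sketch): weight each member inversely to an
# estimate of its variance, here its mean squared deviation from the ensemble
# mean, with the weights renormalized to sum to one.
dev = ((preds - preds.mean(axis=0)) ** 2).mean(axis=1) + 1e-12
weights = (1.0 / dev) / (1.0 / dev).sum()
pred_var_weighted = weights @ preds

print("bagged average (first 3 test points):", pred_bagging[:3])
print("variance-weighted average (first 3): ", pred_var_weighted[:3])
```

Variance-based bagging, the fourth scheme, combines the two ideas: members are trained on bootstrap resamples and then combined with variance-based weights, as in the last step above.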
Pages: 1163-1178
Page count: 16