Optimal linear combinations of neural networks

Cited by: 319
Authors
Hashem, S [1 ]
Affiliations
[1] Pacific Northwest Laboratory, Richland, WA, USA
Keywords
optimal linear combination; model selection; function approximation; collinearity; robust estimation; mixture of experts;
DOI
10.1016/S0893-6080(96)00098-6
Chinese Library Classification (CLC) number
TP18 [Artificial intelligence theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Neural network-based modeling often involves trying multiple networks with different architectures and training parameters in order to achieve acceptable model accuracy. Typically, one of the trained networks is chosen as best, while the rest are discarded. Hashem and Schmeiser (1995) proposed using optimal linear combinations of a number of trained neural networks instead of using a single best network. Combining the trained networks may help integrate the knowledge acquired by the component networks and thus improve model accuracy. In this paper, we extend the idea of optimal linear combinations (OLCs) of neural networks and discuss issues related to the generalization ability of the combined model. We then present two algorithms for selecting the component networks for the combination to improve the generalization ability of OLCs. Our experimental results demonstrate significant improvements in model accuracy from using OLCs compared to using the apparent best network. (C) 1997 Elsevier Science Ltd.
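The paper's two selection algorithms are not reproduced in this record, but the core idea of an MSE-optimal linear combination can be illustrated with a short sketch. The snippet below is a minimal, hypothetical illustration rather than the paper's implementation: it assumes the component networks have already been trained, stacks their predictions on a held-out set, and solves an ordinary least-squares problem for the combination weights. The function names (olc_weights, olc_predict) and the toy data are illustrative assumptions.

```python
import numpy as np

def olc_weights(component_preds, targets, include_constant=True):
    """Estimate MSE-optimal linear combination weights for already-trained models.

    component_preds : array (n_samples, n_models), column j = predictions of model j
                      on a held-out set.
    targets         : array (n_samples,), the true outputs.
    Returns the combination weights (constant term first if requested).
    """
    Y = np.asarray(component_preds, dtype=float)
    t = np.asarray(targets, dtype=float)
    if include_constant:
        Y = np.hstack([np.ones((Y.shape[0], 1)), Y])
    # Least-squares fit via the pseudoinverse, which remains stable when the
    # component outputs are nearly collinear (a central concern of the paper).
    w, *_ = np.linalg.lstsq(Y, t, rcond=None)
    return w

def olc_predict(weights, component_preds, include_constant=True):
    """Combine component predictions with previously estimated weights."""
    Y = np.asarray(component_preds, dtype=float)
    if include_constant:
        Y = np.hstack([np.ones((Y.shape[0], 1)), Y])
    return Y @ weights

if __name__ == "__main__":
    # Toy illustration: three synthetic "networks" approximating sin(x) with noise.
    rng = np.random.default_rng(0)
    x = np.linspace(0, np.pi, 200)
    truth = np.sin(x)
    preds = np.column_stack(
        [truth + rng.normal(0.0, s, x.size) for s in (0.10, 0.15, 0.20)]
    )
    w = olc_weights(preds, truth)
    combined = olc_predict(w, preds)
    best_single = min(np.mean((preds[:, j] - truth) ** 2) for j in range(preds.shape[1]))
    print("best single-network MSE:", best_single)
    print("combined (OLC) MSE     :", np.mean((combined - truth) ** 2))
```

The choice of least squares with a pseudoinverse is one simple way to address the collinearity among component outputs noted in the keywords; the paper's robust estimation and component-selection strategies go beyond this sketch.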
Pages: 599-614
Number of pages: 16
Related papers: 57