Effective Neural Network Ensemble Approach for Improving Generalization Performance

Cited by: 86
Authors
Yang, Jing [1]
Zeng, Xiaoqin [1]
Zhong, Shuiming [2]
Wu, Shengli [3,4]
Affiliations
[1] Hohai Univ, Inst Intelligence Sci & Technol, Nanjing 210098, Jiangsu, Peoples R China
[2] Nanjing Univ Informat Sci & Technol, Sch Comp & Software, Nanjing 210044, Jiangsu, Peoples R China
[3] Jiangsu Univ, Sch Comp Sci & Telecommun Technol, Zhenjiang 212013, Peoples R China
[4] Univ Ulster, Sch Comp & Math, Newtownabbey BT37 0QB, Northern Ireland
Funding
National Natural Science Foundation of China;
Keywords
Diversity; ensemble learning; fusion; neural network ensemble; sensitivity; MULTILAYER PERCEPTRON; SENSITIVITY-ANALYSIS; VARIANCE ANALYSIS; I.I.D. INPUTS; DIVERSITY; COMBINATION; CLASSIFIERS; DESIGN; ERRORS;
DOI
10.1109/TNNLS.2013.2246578
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
With the aim of improving neural networks' generalization performance, this paper proposes an effective neural network ensemble approach built on two novel ideas. The first is to use a network's output sensitivity as a measure of its output diversity at inputs near the training samples, which makes it possible to select diverse individuals from a pool of well-trained neural networks; the second is to employ a learning mechanism that assigns complementary weights for combining the selected individuals. Experimental results show that the proposed approach can construct a neural network ensemble whose generalization performance is better than that of any individual in the ensemble, than that of an ensemble combining all the available individuals, and than that of ensembles using simply averaged weights.
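As a rough illustration of the two ideas in the abstract, the following Python sketch shows what sensitivity-based selection and learned combination weights might look like. It is an assumption-laden simplification, not the authors' algorithm: the perturbation-based sensitivity estimate, the spacing-based selection rule, the least-squares weight fit, and the predict() interface on ensemble members are all illustrative stand-ins.

import numpy as np

def output_sensitivity(net, X, eps=1e-2, n_probes=8, seed=0):
    # Estimate output sensitivity near the training samples by probing
    # with small random input perturbations (a simplified proxy for the
    # sensitivity measure the paper uses as a diversity indicator).
    rng = np.random.default_rng(seed)
    base = net.predict(X)
    deltas = []
    for _ in range(n_probes):
        noise = eps * rng.standard_normal(X.shape)
        deltas.append(np.abs(net.predict(X + noise) - base).mean())
    return float(np.mean(deltas))

def select_diverse(pool, X, k):
    # Pick k networks spread across the range of sensitivity values,
    # a crude stand-in for selecting individuals with diverse outputs.
    order = np.argsort([output_sensitivity(net, X) for net in pool])
    idx = np.linspace(0, len(pool) - 1, k).astype(int)
    return [pool[order[i]] for i in idx]

def learn_weights(members, X_val, y_val):
    # Fit combination weights on held-out data by least squares, standing
    # in for the paper's learning mechanism for complementary weights.
    P = np.column_stack([net.predict(X_val) for net in members])
    w, *_ = np.linalg.lstsq(P, y_val, rcond=None)
    return w

def ensemble_predict(members, w, X):
    # Weighted combination of the selected members' outputs.
    return np.column_stack([net.predict(X) for net in members]) @ w

Because the weights are learned rather than fixed at 1/k, they can differ across members (and may even be negative in this least-squares stand-in), which is one way the complementary role described in the abstract can manifest.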
Pages: 878-887
Number of pages: 10