A dynamic ensemble learning algorithm for neural networks

Cited by: 424
Authors
Alam, Kazi Md Rokibul [1 ]
Siddique, Nazmul [2 ]
Adeli, Hojjat [3 ,4 ,5 ]
Affiliations
[1] Khulna Univ Engn & Technol, Dept Comp Sci & Engn, Khulna 9203, Bangladesh
[2] Ulster Univ, Sch Comp Engn & Intelligent Syst, Derry BT48 7JL, Londonderry, Northern Ireland
[3] Ohio State Univ, Dept Neurosci, Columbus, OH 43210 USA
[4] Ohio State Univ, Dept Neurol, Columbus, OH 43210 USA
[5] Ohio State Univ, Dept Biomed Informat, Columbus, OH 43210 USA
Keywords
Neural network ensemble; Backpropagation algorithm; Negative correlation learning; Constructive algorithms; Pruning algorithms; SELECTION; DIVERSITY; CLASSIFICATION; CLASSIFIERS; COMBINATION; DESIGN;
DOI
10.1007/s00521-019-04359-7
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
This paper presents a novel dynamic ensemble learning (DEL) algorithm for designing ensembles of neural networks (NNs). The DEL algorithm determines the ensemble size, i.e., the number of individual NNs, using a constructive strategy; the number of hidden nodes in each individual NN using a constructive-pruning strategy; and different training samples for each individual NN's learning. To promote diversity, negative correlation learning is introduced, and the training samples are varied across the individual NNs so that the ensemble learns better from the whole training set. The major benefits of the proposed DEL over existing ensemble algorithms are (1) automatic design of the ensemble; (2) simultaneous maintenance of the accuracy and diversity of the NNs; and (3) a minimal number of user-defined parameters. The DEL algorithm is applied to a set of real-world classification problems, namely the cancer, diabetes, heart disease, thyroid, credit card, glass, gene, horse, letter recognition, mushroom, and soybean datasets. Experimental results confirm that DEL produces dynamic NN ensembles of appropriate architecture and diversity that demonstrate good generalization ability.
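The negative correlation learning (NCL) step mentioned in the abstract can be sketched as follows: each member network is trained on its own squared error plus a penalty that pushes its output away from the ensemble mean, which trades a little individual accuracy for diversity. The sketch below is a minimal NumPy illustration on a toy regression task, not the paper's DEL implementation: the data, the one-hidden-layer architecture, and the hyperparameters `M`, `H`, `lam`, and `lr` are illustrative assumptions, and the constructive and pruning steps that grow the ensemble and the hidden layers in DEL are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = sin(x) plus noise (assumed example, not the paper's benchmarks)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)

M = 5      # ensemble size (fixed here; DEL grows it constructively)
H = 10     # hidden nodes per network (DEL adapts this via constructive-pruning)
lam = 0.5  # negative-correlation penalty strength, lambda in [0, 1]
lr = 0.05  # gradient-descent learning rate

# One-hidden-layer tanh networks, one parameter set per ensemble member
W1 = [rng.normal(0, 0.5, (1, H)) for _ in range(M)]
b1 = [np.zeros(H) for _ in range(M)]
W2 = [rng.normal(0, 0.5, H) for _ in range(M)]
b2 = [0.0 for _ in range(M)]

def forward(i, X):
    """Hidden activations and output of member i."""
    h = np.tanh(X @ W1[i] + b1[i])
    return h, h @ W2[i] + b2[i]

def ensemble_mse():
    """MSE of the ensemble mean on the training data."""
    fbar = np.mean([forward(i, X)[1] for i in range(M)], axis=0)
    return np.mean((fbar - y) ** 2)

mse_before = ensemble_mse()
for epoch in range(500):
    outs = [forward(i, X) for i in range(M)]
    fbar = np.mean([f for _, f in outs], axis=0)  # current ensemble mean
    for i in range(M):
        h, f = outs[i]
        # NCL loss per member: 0.5*(f_i - y)^2 - 0.5*lam*(f_i - fbar)^2,
        # so the gradient w.r.t. f_i is (f_i - y) - lam * (f_i - fbar):
        # the second term decorrelates member i from the ensemble mean.
        delta = (f - y) - lam * (f - fbar)
        gW2 = h.T @ delta / len(y)
        gb2 = delta.mean()
        dh = np.outer(delta, W2[i]) * (1 - h ** 2)  # backprop through tanh
        gW1 = X.T @ dh / len(y)
        gb1 = dh.mean(axis=0)
        W2[i] -= lr * gW2; b2[i] -= lr * gb2
        W1[i] -= lr * gW1; b1[i] -= lr * gb1

mse_after = ensemble_mse()
print(f"ensemble MSE: {mse_before:.4f} -> {mse_after:.4f}")
```

With `lam = 0` this reduces to independent backpropagation of each member; increasing `lam` toward 1 strengthens the diversity pressure, which is the trade-off DEL manages automatically.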
Pages: 8675-8690
Number of pages: 16