Learning ensemble classifiers via restricted Boltzmann machines

Cited by: 27
Authors
Zhang, Chun-Xia [1 ]
Zhang, Jiang-She [1 ]
Ji, Nan-Nan [1 ]
Guo, Gao [2 ]
Affiliations
[1] Xi An Jiao Tong Univ, Fac Math & Stat, Inst Stat Decis & Machine Learning, Xian 710049, Shaanxi, Peoples R China
[2] Xian Univ Technol, Sch Sci, Dept Appl Math, Xian 710054, Shaanxi, Peoples R China
Funding
Specialized Research Fund for the Doctoral Program of Higher Education; National Natural Science Foundation of China
Keywords
Ensemble classifier; Bagging; Restricted Boltzmann machine; Deep learning; Majority voting; Diversity; Combining classifiers; Classification; Algorithms
DOI
10.1016/j.patrec.2013.10.009
Chinese Library Classification
TP18 [Artificial intelligence theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Recently, restricted Boltzmann machines (RBMs) have attracted considerable interest in the machine learning field due to their strong ability to extract features. Given some training data, an RBM or a stack of several RBMs can be used to extract informative features. Meanwhile, ensemble learning is an active research area in machine learning owing to its potential to greatly increase the prediction accuracy of a single classifier. However, RBMs have so far received little study in combination with ensemble learning. In this study, we present several methods for integrating RBMs with bagging to generate diverse and accurate individual classifiers. Taking a classification tree as the base learning algorithm, a thorough experimental study conducted on 31 real-world data sets yields some promising conclusions. When using the features extracted by RBMs in ensemble learning, the best strategy is to perform model combination separately on the original feature set and on the feature set extracted by a single RBM. However, prediction performance deteriorates when the features detected by a stack of two RBMs are also considered. As for the features detected by RBMs, good classification performance is obtained only when they are used together with the original features. (C) 2013 Elsevier B.V. All rights reserved.
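The pipeline the abstract describes can be sketched in a few lines: extract features with a single RBM, train a bagged ensemble of classification trees on the original features and another on the RBM features, then combine the two by voting. The following is a minimal illustration using scikit-learn's BernoulliRBM, BaggingClassifier, and DecisionTreeClassifier; the data set, soft-voting combination rule, and all hyperparameters are illustrative assumptions, not the authors' exact implementation.

```python
# Minimal sketch of the abstract's best-performing strategy: model combination
# over bagged trees built on the original features and on single-RBM features.
# Data set and hyperparameters are illustrative assumptions.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.neural_network import BernoulliRBM
from sklearn.preprocessing import MinMaxScaler
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X = MinMaxScaler().fit_transform(X)  # BernoulliRBM expects inputs in [0, 1]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Unsupervised feature extraction with a single RBM.
rbm = BernoulliRBM(n_components=64, learning_rate=0.05, n_iter=30, random_state=0)
H_tr = rbm.fit_transform(X_tr)
H_te = rbm.transform(X_te)

# One bagged ensemble of classification trees per feature set.
bag_orig = BaggingClassifier(DecisionTreeClassifier(), n_estimators=50,
                             random_state=0).fit(X_tr, y_tr)
bag_rbm = BaggingClassifier(DecisionTreeClassifier(), n_estimators=50,
                            random_state=0).fit(H_tr, y_tr)

# Combine the two ensembles by averaging class probabilities (soft voting).
proba = (bag_orig.predict_proba(X_te) + bag_rbm.predict_proba(H_te)) / 2.0
y_pred = proba.argmax(axis=1)
print("combined accuracy:", (y_pred == y_te).mean())
```

The paper reports majority voting over individual classifiers; the probability-averaging combination above is a common soft-voting stand-in and can be swapped for a hard vote over the two ensembles' predictions.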
Pages: 161-170 (10 pages)