Research on Ensemble Learning

Cited by: 71
Authors
Huang, Faliang [1 ]
Xie, Guoqing [1 ]
Xiao, Ruliang [1 ]
Affiliations
[1] Fujian Normal Univ, Fac Software, Fuzhou, Peoples R China
Source
2009 INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND COMPUTATIONAL INTELLIGENCE, VOL III, PROCEEDINGS | 2009
Keywords
machine learning; ensemble learning; adaboost; bagging;
DOI
10.1109/AICI.2009.235
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Ensemble learning is a powerful machine learning paradigm that has exhibited clear advantages in many applications. An ensemble, in the context of machine learning, can be broadly defined as a system constructed from a set of individual models working in parallel, whose outputs are combined by a decision fusion strategy to produce a single answer for a given problem. In this paper we introduce the core ideas of ensemble learning and the key techniques for improving it. On this basis we describe the procedures of two typical algorithms, AdaBoost and Bagging, in detail. Finally, we demonstrate their superiority in classification accuracy through experiments.
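The decision-fusion idea in the abstract can be illustrated with a minimal Bagging ensemble: train each base model on a bootstrap resample of the data and combine their outputs by majority vote. This is a pure-Python sketch, not the paper's implementation; the one-threshold "decision stump" base learner, the toy 1-D data, and all function names are assumptions for illustration.

```python
import random
from collections import Counter

def train_stump(data):
    """Fit a one-threshold classifier (decision stump) on (x, label) pairs,
    choosing the threshold and direction that minimize training error."""
    xs = sorted(set(x for x, _ in data))
    best = None  # (error, threshold, sign)
    for t in xs + [xs[-1] + 1]:
        for sign in (1, -1):
            err = sum(1 for x, y in data
                      if (1 if sign * (x - t) >= 0 else 0) != y)
            if best is None or err < best[0]:
                best = (err, t, sign)
    _, t, sign = best
    return lambda x: 1 if sign * (x - t) >= 0 else 0

def bagging(data, n_models=11, seed=0):
    """Train n_models stumps on bootstrap resamples (sampling with
    replacement); predict by majority vote over the individual models."""
    rng = random.Random(seed)
    models = []
    for _ in range(n_models):
        sample = [rng.choice(data) for _ in data]  # bootstrap resample
        models.append(train_stump(sample))
    def predict(x):
        votes = Counter(m(x) for m in models)
        return votes.most_common(1)[0][0]  # decision fusion: majority vote
    return predict
```

The bootstrap resampling gives each base model a slightly different view of the data, and the majority vote averages away much of the individual models' variance, which is the source of Bagging's accuracy gain described in the paper.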
Pages: 249-252 (4 pages)
Related References (6 records)
[1] Breiman, L. Bagging predictors. Machine Learning, 1996, 24(2): 123-140.
[2] Dietterich, T. G. AI Magazine, 1997, 18: 97.
[3] Freund, Y.; Schapire, R. E. A decision-theoretic generalization of on-line learning and an application to boosting. Journal of Computer and System Sciences, 1997, 55(1): 119-139.
[4] Schapire, R. E. The strength of weak learnability. Machine Learning, 1990, 5(2): 197-227.
[5] Verikas, A.; Lipnickas, A.; Malmqvist, K.; Bacauskiene, M.; Gelzinis, A. Soft combination of neural classifiers: A comparative study. Pattern Recognition Letters, 1999, 20(4): 429-444.
[6] Zhou, Z.-H.; Wu, J.; Tang, W. Ensembling neural networks: Many could be better than all. Artificial Intelligence, 2002, 137(1-2): 239-263.