Classifier Ensembling: Dataset Learning Using Bagging and Boosting

Cited: 0
Authors
Lomte, Santosh S. [1 ]
Torambekar, Sanket G. [2 ]
Affiliations
[1] Coll Engn Matoshri Pratishthan, Nanded, India
[2] Yogeshwari Polytech, Ambajogai, India
Source
COMPUTING AND NETWORK SUSTAINABILITY | 2019, Vol. 75
Keywords
Classifiers; Bagging; Boosting; Meta-decision tree; Data mining; Ensemble; Errors; Learning; Training;
DOI
10.1007/978-981-13-7150-9_9
Chinese Library Classification
TP3 [Computing Technology, Computer Technology];
Discipline Code
0812 ;
Abstract
Data mining is used to extract desired information from the huge amounts of data preserved in data warehouses and databases. Many data mining techniques have been proposed over the years, such as association rules, decision trees, and neural networks. The efficiency of classifiers on real-time or artificial data can be enhanced by techniques such as bagging, boosting, and AdaBoost, which also improve classification accuracy. Boosting and bagging are two widely used ensemble methods for classification. Since the task is to increase the efficiency of the classifier, past work shows that it is always better to combine classifiers than to rely on random guessing. Among the available boosting techniques, the AdaBoost algorithm is one of the best, especially for branching tasks. In the following work, a classification technique is explained that builds ensembles of classifiers for bagging and for boosting separately: a decision tree is used as the base classifier for bagging, and an artificial neural network (ANN) as the base classifier for boosting. A meta-decision tree (MDT) is used to combine the ensembles. The paper is organized into the sections Introduction, Literature Review, Problem Statement, Issues and Challenges, Research Methodology, and Analysis of the Work.
Pages: 11
Related Papers (50 in total)
  • [1] Adapting Bagging and Boosting to Learning Classifier Systems
    Liu, Yi
    Browne, Will N.
    Xue, Bing
    APPLICATIONS OF EVOLUTIONARY COMPUTATION, EVOAPPLICATIONS 2018, 2018, 10784 : 405 - 420
  • [2] Ensembling Heterogeneous Learning Models with Boosting
    Nascimento, Diego S. C.
    Coelho, Andre L. V.
    NEURAL INFORMATION PROCESSING, PT 1, PROCEEDINGS, 2009, 5863 : 512 - 519
  • [3] Ensembling Learning Based Melanoma Classification Using Gradient Boosting Decision Trees
    Han, Yipeng
    Zheng, Xiaolu
    AIPR 2020: 2020 3RD INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND PATTERN RECOGNITION, 2020, : 104 - 109
  • [4] Mammographic Classification Using Stacked Ensemble Learning with Bagging and Boosting Techniques
    Abubacker, Nirase Fathima
    Hashem, Ibrahim Abaker Targio
    Hui, Lim Kun
    JOURNAL OF MEDICAL AND BIOLOGICAL ENGINEERING, 2020, 40 (06) : 908 - 916
  • [6] Handling Imbalanced Dataset in Multi-label Text Categorization using Bagging and Adaptive Boosting
    Winata, Genta Indra
    Khodra, Masayu Leylia
    5TH INTERNATIONAL CONFERENCE ON ELECTRICAL ENGINEERING AND INFORMATICS 2015, 2015, : 500 - 505
  • [7] Using boosting to prune bagging ensembles
    Martinez-Munoz, Gonzalo
    Suarez, Alberto
    PATTERN RECOGNITION LETTERS, 2007, 28 (01) : 156 - 165
  • [8] A Classifier Using Online Bagging Ensemble Method for Big Data Stream Learning
    Lv, Yanxia
    Peng, Sancheng
    Yuan, Ying
    Wang, Cong
    Yin, Pengfei
    Liu, Jiemin
    Wang, Cuirong
    TSINGHUA SCIENCE AND TECHNOLOGY, 2019, 24 (04) : 379 - 388