Multiple strong and balanced cluster-based ensemble of deep learners

Cited by: 12
Authors
Jan, Zohaib [1]
Verma, Brijesh [1]
Affiliations
[1] Cent Queensland Univ, Ctr Intelligent Syst, Brisbane, Qld 4000, Australia
Funding
Australian Research Council;
Keywords
Deep learning; Ensemble classifier; Neural networks; Clustering; CLASSIFIER ENSEMBLES; RANDOM FORESTS; REGRESSION;
DOI
10.1016/j.patcog.2020.107420
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Convolutional Neural Networks (CNNs), also known as deep learners, have seen much success in recent years owing to the availability of large amounts of data and high-performance computational resources. A CNN can be trained effectively when large amounts of data are available, as this enables the network to find the set of features and weights that yields the best generalization performance. However, because of this need for large datasets, CNNs require substantial resources, such as running time and computational power, to reach reasonable performance. Additionally, unbalanced data makes it difficult to train a CNN that generalizes well. To alleviate these limitations, this paper proposes a novel ensemble of deep learners that learns by combining multiple deep learners, each trained on small, strongly class-associated subsets of the input data. We propose a novel methodology for generating random subspaces by clustering the input data, and a measure that classifies each cluster as a strong data cluster or a balanced data cluster. A further methodology balances all strong data clusters in the pool so that an architecturally simple CNN can be trained on all balanced data clusters simultaneously. The classification decisions of all trained CNNs are then fused through majority voting to produce the class decisions of the ensemble. The performance of the proposed ensemble approach is evaluated on UCI benchmark datasets, and the results are compared with existing state-of-the-art ensemble approaches. Significance testing was conducted to further validate the efficacy of the results, and a significance-test analysis is presented. (C) 2020 Elsevier Ltd. All rights reserved.
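The abstract outlines a pipeline: cluster the input data, keep strongly class-associated clusters, balance them, train one simple learner per balanced cluster, and fuse the learners' decisions by majority voting. The sketch below illustrates that idea only; it is not the authors' implementation. It assumes KMeans for the clustering step, a simple purity threshold as a stand-in for the paper's strong-cluster measure, random oversampling as the balancing scheme, and scikit-learn's MLPClassifier in place of the architecturally simple CNN.

```python
# Minimal sketch of a cluster-based ensemble with majority-vote fusion.
# Assumptions (not from the paper): KMeans clustering, a purity threshold as a
# proxy for the strong-cluster measure, random oversampling for balancing, and
# MLPClassifier standing in for the architecturally simple CNN.
import numpy as np
from collections import Counter
from sklearn.cluster import KMeans
from sklearn.neural_network import MLPClassifier

def build_ensemble(X, y, n_clusters=5, purity_threshold=0.6, seed=0):
    rng = np.random.default_rng(seed)
    clusters = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed).fit_predict(X)
    learners = []
    for c in range(n_clusters):
        idx = np.where(clusters == c)[0]
        if idx.size == 0:
            continue
        yc = y[idx]
        counts = Counter(yc)
        # Proxy "strong cluster" check: dominant class covers enough of the cluster.
        if max(counts.values()) / len(yc) < purity_threshold:
            continue
        if len(counts) < 2:
            continue  # a single-class cluster cannot train a classifier
        # Balance the cluster by randomly oversampling minority classes (assumed scheme).
        target = max(counts.values())
        parts_X, parts_y = [], []
        for cls, n in counts.items():
            cls_idx = idx[yc == cls]
            if n < target:
                extra = rng.choice(cls_idx, size=target - n, replace=True)
                keep = np.concatenate([cls_idx, extra])
            else:
                keep = cls_idx
            parts_X.append(X[keep])
            parts_y.append(y[keep])
        Xb, yb = np.vstack(parts_X), np.concatenate(parts_y)
        clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=seed)
        learners.append(clf.fit(Xb, yb))
    return learners

def predict_majority(learners, X):
    # Each learner votes on every test sample; the ensemble label is the majority vote.
    votes = np.stack([clf.predict(X) for clf in learners])
    return np.array([Counter(col).most_common(1)[0][0] for col in votes.T])
```

Each learner here sees only its own balanced cluster during training but votes on every test sample, which mirrors the abstract's description of fusing the decisions of all trained learners by majority voting.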
Pages: 11
Related papers
50 items in total
  • [21] A Cluster-based Hierarchical Partitioning Approach for Multiple FPGAs
    Xiao, Chunhua
    Huang, Zhangqin
    Li, Da
    JOURNAL OF COMPUTERS, 2014, 9 (09) : 2173 - 2180
  • [22] A cluster-based and routing balanced P2P lookup protocol
    Lu, Yang
    Chen, Ming
    SNPD 2007: EIGHTH ACIS INTERNATIONAL CONFERENCE ON SOFTWARE ENGINEERING, ARTIFICIAL INTELLIGENCE, NETWORKING, AND PARALLEL/DISTRIBUTED COMPUTING, VOL 1, PROCEEDINGS, 2007, : 646 - +
  • [23] Weighted Ensemble Models Are Strong Continual Learners
    Marouf, Imad Eddine
    Roy, Subhankar
    Tartaglione, Enzo
    Lathuiliere, Stephane
    COMPUTER VISION - ECCV 2024, PT LXXI, 2025, 15129 : 306 - 324
  • [24] Cluster-based Aggregate Load Forecasting with Deep Neural Networks
    Cini, Andrea
    Lukovic, Slobodan
    Alippi, Cesare
    2020 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2020,
  • [25] Cluster-Based One-Class Ensemble for Classification Problems in Information Retrieval
    Lipka, Nedim
    Stein, Benno
    Anderka, Maik
    SIGIR 2012: PROCEEDINGS OF THE 35TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL, 2012, : 1041 - 1042
  • [26] Cluster-based ensemble learning model for improving sentiment classification of Arabic documents
    Al Mahmoud, Rana Husni
    Hammo, Bassam H.
    Faris, Hossam
    NATURAL LANGUAGE ENGINEERING, 2024, 30 (05) : 1091 - 1129
  • [27] Cluster-Based SJPDAFs for Classification and Tracking of Multiple Moving Objects
    Hatao, Naotaka
    Kagami, Satoshi
    FIELD AND SERVICE ROBOTICS, 2015, 105 : 303 - 317
  • [28] An ensemble deep learning model for classification of students as weak and strong learners via multiparametric analysis
    Kaur, Harjinder
    Kaur, Tarandeep
    Bhardwaj, Vivek
    Kumar, Mukesh
    DISCOVER APPLIED SCIENCES, 2024, 6 (11)
  • [29] Cluster-Based Boosting
    Miller, L. Dee
    Soh, Leen-Kiat
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2015, 27 (06) : 1491 - 1504
  • [30] Cluster-based selection
    Dunbar, JB
    PERSPECTIVES IN DRUG DISCOVERY AND DESIGN, 1997, 7-8 : 51 - 63