A non-specialized ensemble classifier using multi-objective optimization

Cited by: 13
Authors
Fletcher, Sam [1]
Verma, Brijesh [1]
Zhang, Mengjie [2]
Affiliations
[1] Cent Queensland Univ, Ctr Intelligent Syst, Brisbane, Qld, Australia
[2] Victoria Univ Wellington, Evolutionary Computat Res Grp, Wellington, New Zealand
Funding
Australian Research Council
Keywords
Ensemble classification; Multi-objective optimization; Genetic algorithm; Multiple classifiers; Classifier selection; Diversity; Double-fault measure; NONNEGATIVE MATRIX FACTORIZATION; PARTICLE SWARM OPTIMIZATION; FEATURE-SELECTION; DIVERSITY; REGRESSION; FRAMEWORK; PERFORMANCE; ACCURACY; SPARSITY; FUSION;
DOI
10.1016/j.neucom.2020.05.029
CLC Number (Chinese Library Classification)
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Ensemble classification algorithms are often designed for data with certain properties, such as imbalanced class labels, a large number of attributes, or continuous data. While they perform well within the targeted domain, these algorithms sacrifice performance when applied to data outside it. We propose a non-specialized ensemble classification algorithm that uses multi-objective optimization instead of relying on heuristics and fragile user-defined parameters. Only two user-defined parameters are included, and both are found to have large windows of values that produce statistically indistinguishable results, indicating that little expertise is required from the user to achieve good results. Additionally, when given a large initial set of trained base classifiers, we demonstrate that a multi-objective genetic algorithm aiming to optimize prediction accuracy and diversity prefers particular types of classifiers over others. The total number of chosen classifiers is also surprisingly small: only 10.14 classifiers on average, out of an initial pool of 900. This occurs without any explicit preference for small ensembles of classifiers. Even with these small ensembles, significantly lower empirical classification error is achieved compared to the current state-of-the-art. (C) 2020 Elsevier B.V. All rights reserved.
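To make the selection mechanism concrete, the sketch below is a minimal, illustrative version of the idea and not the paper's implementation: a pool of bagged, shallow decision trees is built, and a deliberately simple two-objective genetic algorithm (bit-mask individuals, Pareto-dominance tournaments, bit-flip mutation) searches for a subset that jointly minimizes the validation error of the majority vote and the average pairwise double-fault measure. The pool size (50 rather than 900), the base learners, the data set (scikit-learn's breast-cancer data), and all GA settings are assumptions made only so the example runs quickly and self-contained.

```python
# Minimal sketch (illustrative, not the paper's implementation): select a small
# ensemble from a pool of base classifiers with a simple two-objective GA.
# Objective 1: minimise validation error of the majority vote.
# Objective 2: minimise the average pairwise double-fault measure.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# --- Train an illustrative pool of bagged, shallow decision trees -------------
X, y = load_breast_cancer(return_X_y=True)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)
POOL = 50  # the paper starts from 900 base classifiers; kept small for a quick run
pool_preds = []
for _ in range(POOL):
    boot = rng.integers(0, len(X_tr), len(X_tr))            # bootstrap sample
    tree = DecisionTreeClassifier(max_depth=3).fit(X_tr[boot], y_tr[boot])
    pool_preds.append(tree.predict(X_val))
pool_preds = np.array(pool_preds)                            # shape (POOL, n_val)
wrong = pool_preds != y_val                                  # per-classifier mistakes

def objectives(mask):
    """Return (error, double_fault) for the classifiers selected by `mask`."""
    chosen = np.flatnonzero(mask)
    if chosen.size == 0:
        return 1.0, 1.0                                      # penalise empty ensembles
    votes = pool_preds[chosen]
    majority = (votes.mean(axis=0) >= 0.5).astype(int)       # binary majority vote
    error = float(np.mean(majority != y_val))
    if chosen.size == 1:
        return error, 0.0
    w = wrong[chosen].astype(float)
    both_wrong = (w @ w.T) / w.shape[1]                      # pairwise double-fault matrix
    df = (both_wrong.sum() - np.trace(both_wrong)) / (chosen.size * (chosen.size - 1))
    return error, float(df)

def dominates(a, b):
    """Pareto dominance when minimising both objectives."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

# --- Simple GA: bit-mask individuals, dominance tournaments, bit-flip mutation -
POP, GENS, P_MUT = 40, 30, 1.0 / POOL
pop = rng.random((POP, POOL)) < 0.1                          # start with sparse ensembles
for _ in range(GENS):
    fits = [objectives(ind) for ind in pop]
    children = []
    while len(children) < POP:
        i, j = rng.integers(0, POP, size=2)
        parent = pop[i] if dominates(fits[i], fits[j]) else pop[j]
        child = parent.copy()
        flips = rng.random(POOL) < P_MUT                     # bit-flip mutation
        child[flips] = ~child[flips]
        children.append(child)
    pop = np.array(children)

fits = [objectives(ind) for ind in pop]
best = min(range(POP), key=lambda k: fits[k])                # lowest error, then lowest double-fault
print(f"selected {int(pop[best].sum())} of {POOL} classifiers, "
      f"validation error {fits[best][0]:.3f}")
```

A full multi-objective approach would keep the whole Pareto front rather than a single individual; the sketch collapses to the lowest-error member at the end purely to have one ensemble to report.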
Pages: 93-102
Number of pages: 10
References (89 in total)
[81] Whalen, Sean; Pandey, Gaurav. A Comparative Analysis of Ensemble Classifiers: Case Studies in Genomics. 2013 IEEE 13th International Conference on Data Mining (ICDM), 2013: 807-816.
[82] Wozniak, Michal; Grana, Manuel; Corchado, Emilio. A survey of multiple classifier systems as hybrid systems. Information Fusion, 2014, 16: 3-17.
[83] Xiao, Jin; Xie, Ling; He, Changzheng; Jiang, Xiaoyi. Dynamic classifier ensemble model for customer classification with imbalanced class distribution. Expert Systems with Applications, 2012, 39(3): 3668-3675.
[84] Xue, Bing; Zhang, Mengjie; Browne, Will N. Particle Swarm Optimization for Feature Selection in Classification: A Multi-Objective Approach. IEEE Transactions on Cybernetics, 2013, 43(6): 1656-1671.
[85] Yang, Yun; Jiang, Jianmin. Hybrid Sampling-Based Clustering Ensemble With Global and Local Constitutions. IEEE Transactions on Neural Networks and Learning Systems, 2016, 27(5): 952-965.
[86] Yin, Xu-Cheng; Huang, Kaizhu; Yang, Chun; Hao, Hong-Wei. Convex ensemble learning with sparsity and diversity. Information Fusion, 2014, 20: 49-59.
[87] Yin, Xu-Cheng; Huang, Kaizhu; Hao, Hong-Wei; Iqbal, Khalid; Wang, Zhi-Bin. A novel classifier ensemble method with sparsity and diversity. Neurocomputing, 2014, 134: 214-221.
[88] Zhang, Le; Suganthan, Ponnuthurai N. Oblique Decision Tree Ensemble via Multisurface Proximal Support Vector Machine. IEEE Transactions on Cybernetics, 2015, 45(10): 2165-2176.
[89] Zhou, Z.-H. Ensemble Methods: Foundations and Algorithms, 1st ed. 2012.